Of bot armies, brute force, admin and WordPress

Hello,

I put in some work this weekend looking at WordPress and the reports of a bot army attacking WordPress installations.  Our WordPress install is a complicated beast and that in itself may, largely, protect us from the simpler attacks.  I knew about the possible threat late on Friday but family commitments prevented me from looking earlier than 5.30pm on Sunday.  In theory I shouldn’t be working then and should be spending time with my family, but this is important.  Later, we’ll have practices in place to limit the amount of time we spend working on emergencies out of hours.

Initially, I panicked.  I thought I should shut off access to the service.  There was no one to talk to about the issues of leaving the service up or taking it down.  With my partner away for the weekend I didn’t have time to do anything rash.  If I switched off the service I would be doing the hackers’ work for them.  If I didn’t, I risked having to re-install the service.

So, on Sunday I started reading around the problem.  The first few blogs I read talked about third-party, pay-for WordPress security modules.  This blog lists some things we can do to protect the service.  CloudFlare are, effectively, selling a service around WordPress security.  That led me to look at mod_security for Apache, which allows us to ‘patch’ an application without changing the application’s code; web application code changes take time.  Then I came across pages, e.g. codex.wordpress.org, that explain the current attacks.  The account this attack goes after is ‘admin’.  Big phew!  When WordPress is installed it lets us change the admin user to anything at all.  Good practice.
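
For the record, here is a minimal sketch of the kind of ‘virtual patch’ mod_security makes possible, assuming mod_security2 is loaded and placed in the server config rather than .htaccess; the rule id is arbitrary.  It refuses POSTs to wp-login.php that arrive without a Referer header, a common trait of the simpler bots (a determined attacker can spoof that easily enough, so it is a speed bump, not a fix):

<LocationMatch "/wp-login\.php">
SecRule REQUEST_METHOD "POST" "chain,id:100001,phase:2,deny,status:403,log,msg:'wp-login POST with no Referer'"
SecRule &REQUEST_HEADERS:Referer "@eq 0"
</LocationMatch>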

With that known and the pressure off, I decided to look at blacklisting functionality for WordPress.  I happened across Better WP Security, which has a lot of features and 5 stars from 1300 WP admins.  It looks good but it’s not the sort of thing you can install on a Sunday, on a WordPress install as complicated as ours, without going through the university’s change advisory board (CAB).

With the idea of keeping things simple I started looking at the access_log.  I could see repeated attempts by some IPs to reach URLs including wp-login.php, /register/, wp-admin and site.  I checked those IPs against Google to see if any security websites reported them as suspicious and active.  What I didn’t want to do was blacklist IPs belonging to genuine users.  I added this to the root .htaccess:

<FilesMatch ".*$">
order allow,deny
allow from all
deny from x.x.x.x
deny from y.y.y.y
deny from z.z.z.z
</FilesMatch>

This code tails the access_log looking for those URLs:

tail -f access_log | egrep '404|/register/|wp-login'

Today an email popped up talking about an attempt to create a new site using wp-signup.  We have site registration switched off.  This could be important in protecting our service.

This discovery led me to change my script to rank accesses and discover the major bots knocking on the door:

 cat access_log|egrep '404|/register/|wp-login|site|wp-signup'| \
   awk '{ ips[$1]++ }END{for(i in ips){print ips[i] " " i;}}'| \
   sort -n

One bot has had 52696 attempts.  That this went unnoticed says something about how we work with WordPress (and other web applications).  Popular software will be targeted and we need generic tools to discover the attacks.
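
As a first stab at a generic tool, here is a small sketch that prints any IP making more than a set number of suspicious requests; cron could run it and mail the output.  The log path and the threshold of 500 are assumptions to be tuned:

#!/bin/sh
# List IPs with more than THRESHOLD hits on suspicious URLs in the access log.
LOG=/var/log/httpd/access_log   # assumption: point this at the real log
THRESHOLD=500                   # assumption: tune to taste
egrep '404|/register/|wp-login|wp-signup' "$LOG" \
  | awk -v t="$THRESHOLD" '{ ips[$1]++ } END { for (i in ips) if (ips[i] > t+0) print ips[i], i }' \
  | sort -rn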

Well, that’s it.  I hope it helps.


First install of Project.net on Linux

From Project.net:

Project.net for Mission Critical Project Portfolio Management

Project.net is a complete Project Portfolio Management (PPM) solution designed to capture, display, report on, and resolve the complex interrelationships organizations tackle when planning and executing major initiatives.

I’m going to publish this regularly so that interested parties can see the progress.

I have been given a VM running CentOS 6.3 with 60GB of disk, 4GB of RAM and 1 CPU.  Of course, as the service gets taken up this can be scaled up.  I’m following the document for installing on Linux.  The following is a list of steps taken including the gotchas:

  • yum update
  • yum -y install rsync
  • setup server for SSH without passwords
  • yum -y groupinstall "Web Server"
  • mkdir -p /usr1/home
  • useradd -d /usr1/home/www -c 'Web Server' www
  • chown -R www:www /var/lib/dav /var/cache/mod_proxy /var/cache/mod_ssl
  • All links to Oracle XE 10g lead to 11g.  This could be a problem.  An admin on a forum states that 11g will work.
  • yum -y install libaio-devel bc  <-- don’t miss this one!
  • unzip the oracle archive and rpm -ivh it
  • make sure /etc/hosts has your server defined in it
  • /etc/init.d/oracle-xe configure
    • Oracle Application Express likes port 8080 (so does Tomcat, hmmm.)
    • HTTP Port 8081
    • Listener Port 1521
    • Haz database!
  •  Pull Project.net archive to /usr/local/src
  • mkdir /usr/local/projectnet
  • unzip archive to /usr/local/projectnet
  • find /usr/local/project.net/database/ -type f -exec dos2unix {} \;
  • edit …9.2.0/new/pnetMasterDBBuild.sh to reflect install
  • “If you are using Oracle Express set the PNET_BUILD_DB_DATABASE_NAME variable to the value XE”
  • cd /usr/local/project.net/database/create-scripts/versions/9.2.0/new/
  • run the script, sip something nice, sip something nice…
  • tail -f /tmp/pnet_test_db_build.log.  The script pauses waiting for input but doesn’t prompt for answers or supply default values.  I was checking the install log when I noticed it stop and ask a question, so I hit return in the running script’s window.  Fingers crossed (10pm).  Just checking the log now (9.25am); there are errors such as:
  • install java jre 6.0.x rpm
  • alternatives --install /usr/bin/java java /usr/java/jre1.6.0_37/bin/java 6037
  • pull jce_policy-6.zip to /usr/local/src
  • cp jce/*.jar /usr/java/jre1.6.0_37/lib/security/
  • pull apache-tomcat-6.0.35.tar.gz to /usr/local/src
  • pull apache-activemq-5.7.0-bin.tar.gz to /usr/local/src
  • useradd -d /usr1/home/projectnet -c ‘Project Net’ projectnet
  • su – projectnet
  • tar xf /usr/local/src/apache-tomcat-6.0.35.tar.gz
  • edit .bashrc to reflect CATALINA_HOME and JAVA_HOME
  • edit ./apache-tomcat-6.0.35/conf/tomcat-users.xml change passwords
  •  cp /usr/local/project.net/lib/mail.jar /usr/local/project.net/lib/activation.jar ~/apache-tomcat-6.0.35/lib/
  • cp /usr/local/project.net/lib/jdbc/ojdbc14.jar ~/apache-tomcat-6.0.35/lib/
  • mkdir ~/apache-tomcat-6.0.35/endorsed
  • cp /usr/local/project.net/lib/endorsed/* ~/apache-tomcat-6.0.35/endorsed/
  • edit apache-tomcat-6.0.35/conf/server.xml and change port 8080 to 9090
  • edit catalina.sh to reflect production system with -Xms256m -Xmx1024m
  • logging : using Log4j 1.2.9 and commons-logging-1.1.1-bin.tar.gz
  • create /etc/init.d/tomcat (a minimal sketch appears after this list)
  • chkconfig tomcat on
  • yum -y install apr-devel openssl-devel ant
  • pull jdk-6u37-linux-x64-rpm.bin to /usr/local/src/
  • alternatives --install /usr/bin/javac javac /usr/java/jdk1.6.0_37/bin/javac 6038
  • alternatives –config javac
  • build APR in /usr1/home/projectnet/apache-tomcat-6.0.35/bin/tomcat-native-1.1.22-src/jni
  • ant and ant jar in /usr1/home/projectnet/apache-tomcat-6.0.35/bin/tomcat-native-1.1.22-src/jni
  • Adjust JAVA_OPTS to reflect -Djava.library.path=/usr/local/apr/lib/
  • edit bin/linux-x86-64/activemq to reflect ActiveMQ home
  • edit bin/linux-x86-64/wrapper.conf to reflect ActiveMQ home in set.default.ACTIVEMQ_HOME and set.default.ACTIVEMQ_BASE
  • ln -s /usr1/home/projectnet/apache-activemq-5.6.0/bin/linux-x86-64/activemq /etc/init.d/activemq (as root)
  • chkconfig --add activemq
  • service activemq start (check data/wrapper.log)
  • For project.net edit conf/context.xml to connect to Oracle and a mail server.
  • As projectnet : cp /usr1/local/project.net/app/pnet.war ~/apache-tomcat-6.0.35/webapps/
  • mv ROOT ../ROOT.webapp
  • mv pnet.war ROOT.war
  • make sure passwords are correct in webapps/ROOT/META-INF/context.xml, ./conf/Catalina/localhost/ROOT.xml and conf/context.xml
  • (as root) /etc/init.d/tomcat restart
  • Haz Project.net application!
  •  Prepare Apache to be the port 80 frontend
  • create /etc/httpd/conf.d/pm4s.conf :
  • # tomcat integration
    ProxyPreserveHost On
    ProxyPass / ajp://localhost:8009/ min=5 ttl=120 keepalive=On ping=1
    ProxyPassReverse / ajp://localhost:8009/
  •  Needs SSL set up.  Done but needs proper SSL cert.
  • Configure Project.net
  • Change password and some other details
  • Set up docvault to be in ~projectnet/docvault
  • Set up Sys.Settings to reflect the /usr1/home/projectnet install
  • Additional:
  • A keystore (for the LDAP cert) needs creating for Java, and Tomcat needs to run with keystore arguments (see the sketch after this list).
  • Installed licence key.  Every user must be given the key before registration.
  • Redirect http to https (see the sketch below).
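
The http-to-https redirect in that last step can be as simple as the following in the port 80 virtual host; a sketch only, assuming mod_rewrite is enabled:

# port 80 VirtualHost: send everything to https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]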
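
The ‘create /etc/init.d/tomcat’ step above glosses over the script itself.  A minimal sketch, assuming the projectnet user and the Tomcat path used elsewhere in this post; a production script would also want a status action:

#!/bin/sh
# chkconfig: 345 90 10
# description: Apache Tomcat for Project.net (minimal sketch)
CATALINA_HOME=/usr1/home/projectnet/apache-tomcat-6.0.35
TOMCAT_USER=projectnet
case "$1" in
  start)   su - "$TOMCAT_USER" -c "$CATALINA_HOME/bin/startup.sh" ;;
  stop)    su - "$TOMCAT_USER" -c "$CATALINA_HOME/bin/shutdown.sh" ;;
  restart) $0 stop; sleep 5; $0 start ;;
  *)       echo "Usage: $0 {start|stop|restart}"; exit 1 ;;
esac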
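
And for the LDAP certificate keystore mentioned under “Additional”, the usual recipe looks something like this; the alias, file names and passwords here are placeholders:

# import the LDAP CA certificate into a keystore (run as projectnet)
keytool -import -alias ldap-ca -file /usr/local/src/ldap-ca.crt \
  -keystore /usr1/home/projectnet/.keystore -storepass changeit
# then add to JAVA_OPTS so Tomcat trusts it
export JAVA_OPTS="$JAVA_OPTS -Djavax.net.ssl.trustStore=/usr1/home/projectnet/.keystore -Djavax.net.ssl.trustStorePassword=changeit"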

oEmbed YouTube videos

This is a quick test demonstrating the built-in oEmbed support for YouTube in WordPress. Before this feature was added we had to rely on plugins like WP YouTube Player and YouTuber. We should now be able to embed videos and size them using WordPress’s core code.

Let’s go…

It should be as simple as pasting the ‘Share’ URL on a line by itself:

http://youtu.be/IIlKiRPSNGA

You should see the video embedded in the post or the page.

This code will size the embedded video:

[embed width="230" height="200"]http://youtu.be/8gpjk_MaCGM[/embed]

To get the correct URL for oEmbed, click on ‘Share’ for the video on the YouTube website. As time goes on, it is likely that plugins that don’t offer anything over this functionality will fall out of date with the core WordPress code.

And the conversation continues:

https://twitter.com/MohamedKassam/statuses/256024744602779648

Mohamed Kassam (mind-bending Twitter homepage) pointed out that oEmbedded YouTube videos can be extended using the parameters outlined on the YouTube Embedded Players and Player Parameters page.  I haven’t been able to prove this works.

 


Soundcloud oEmbed test

http://youtu.be/9jrjZwy7knY


Audio test

text

text2

text3


New Installation of The Commons including Buddy Press Part 3

Caching, Buddy Press and Joss’s bits’n’pieces

Caching and Speed-ups

I’m using some of the things Joss and I talked about but, also, this page on optimising WordPress.

This is the second time I’ve looked at caching in WordPress.  Our WordPress is implemented as multi-site (WPMU).  We have a handful of plugins which, combined with WPMU and the version of PHP we have on CentOS, didn’t work with the caching solutions I tried then.  I didn’t have the time then, and I don’t now, to spend too long trying to get caching working.  I’m going to roll the dice a few times and check against our most complicated blog, created by Rob Watson.  Here goes:

  • APC and APC Object Cache Backend didn’t work with the Twitter widgets in the test blog.  I’m not sure of that now because the radio blog is using the HTTPS plugin, which can confuse web browsers with mixed http/https content.
  • Expires Headers in .htaccess didn’t break anything:
    <FilesMatch "\.(ico|jpg|jpeg|png|gif|js|css|swf)$">
    ExpiresActive on
    ExpiresDefault "access plus 30 days"
    Header unset ETag
    FileETag None
    </FilesMatch>

  • Installed WP_DBManager and switched on the optimise and repair database functions.  There is potential for damage here (I’m not a fan of MySQL)
  • eAccelerator switched on.  88.99 MB/sec for the iSCSI, 170.54 MB/sec for ‘local’ (VM) disk, 254.87 MB/sec for a RAM disk!  If the VM had enough memory I could create the eAccelerator cache in RAM.  Need to measure the size of the cache over time.  (A sketch of this sort of throughput test appears after this list.)
  • WP Super Cache turned on; the claimed pages-served rate went from 2 to 250, but it may be a little too hairy for users as the settings are per blog.
  • Even though the server side of things may be very quick now, the radio blog still takes about 5 seconds to render.  This is likely down to the amount of client-side rendering done in JavaScript by plugins like Twitter.  Using Chrome (don’t do this at home, folks) can help in this respect.
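
For reference, throughput figures like the ones above can be produced with a quick dd against each storage location.  A rough sketch only; it ignores caching subtleties and the paths are assumptions:

# crude sequential-write tests, 1GB each
dd if=/dev/zero of=/usr1/ddtest bs=1M count=1024 conv=fdatasync      # iSCSI-backed mount
dd if=/dev/zero of=/var/tmp/ddtest bs=1M count=1024 conv=fdatasync   # 'local' VM disk
dd if=/dev/zero of=/dev/shm/ddtest bs=1M count=1024                  # RAM disk
rm -f /usr1/ddtest /var/tmp/ddtest /dev/shm/ddtest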

Now for the reason we’re here today: Buddy Press

Nicky Rowe had a quick look at BuddyPress (BP) on our install.

Here are the issues so far encountered:

  • When https is in the address bar, clicking on users’ names sends you to their network-wide Buddypress profile (like it should), but when the link is just http, it tries to send you to a non-existent profile on the individual blog. This happens on both Buddypress and non-Buddypress themes. From what little I’ve read it seems to be a problem with permalinks or http headers or something.
  • Not all members are showing up on the Members list – apparently this is because it will only show members who have logged in after BP was installed. The way this laptop is set up it might not be possible to test non-admin users.
  • You can only invite people to join a group if you are already ‘friends’ with them – probably a bit of a hassle for staff who might want to use BP like Blackboard, but there is a plugin called Invite Anyone that will let invites go to anyone.
  • Sometimes blogs don’t show up on the list of blogs – sometimes I click on Blogs and it shows 30 blogs, other times it shows 350. Will keep an eye on this.
  • If a user makes a post on a blog, but that blog has been set to block search engines, then it won’t show up on the Buddypress activity stream, and therefore the Commons portal when it’s built. Users will need to be notified to enable it if they want to appear on the portal.

So, here goes.  I’m going to install the plugin via the Install Plugins page…boom!  I’ve accepted, via the wizard, all of the defaults and installed the BuddyPress Template Pack.

Soon, I’m asked to ‘Repair’ BP by creating new pages and associating them with features of BP:

I’ve named them like this to make them stand out against previously authored pages, but pages named in this way will also help build a language around The Commons that is distinct and, possibly, less confusing to users, e.g. “Would you like a Spot on The Commons?”

I’m changing the page names to the implicit titles because the BP toolbar uses the implicit titles.  Anything else would be confusing.

I’m pleased to find that users’ blogs aren’t touched by the change, but I want to convert my blog to be compatible with BP.  Initially, I tried to use the theme Custom Community, which is BP compatible, but all I got after Network Activating and using it was a 500 message without any more detail.  Then I discovered that the BuddyPress Default theme needs to be Network Activated.

Added plugins:

  •  Subscribe to Comments
  • New Blog Defaults
  • Suffusion BuddyPress Pack
  • BuddyPress Template Pack
  • Buddypress Toolbar
  • BP MPO (More Privacy Options)
  • BuddyPress Group Email (is a pay for plugin)
  • BuddyPress Group Email Subscription
  • BP Mobile
  • BP External Group Blogs
  • Custom headers and footers
  • Buddypress Ajax Chat
  • Buddypress Twitter
  • Google Plus and Facebook buttons ???
  • Klout???
  • BuddyPress Registration Groups
  • BuddyPress Album
  • Achievements for BuddyPress ???
  • Buddypress Topic Mover
  • Can’t connect to Jetpack because I’m on the dev machine which can’t be seen from the web.
  • Looking at Status as a responsive front page.  Looks useful but not there (for the effort I’ve put in.)
  • Bebop by JISC
  • Twitter Cards Participate
  • Enable BuddyPress themes for the main blog only
  • Piwik ??? If we configure it, how does it relate to WPMU? How do users check the stats for their blogs?
  • CollabPress Project Management
  • wp-content/plugins/buddypress/bp-themes/bp-default/_inc/css/default.css:
#header h1 a {
 color: #fff;
 font-size: 56px;     /* <---- */
 text-decoration: none;
 margin-left: 250px;  /* <---- */
}
  • Update avatar for OurAdmin

Images, Video and Audio with HTML5 and Flash support

oEmbed is a technique that can be supported by a website supplying content, for example YouTube and Twitter.

For an example of embedding YouTube:

Click on the Share button to discover the link and then just paste it into the blog:

http://youtu.be/Q-jAGdD1H8g

The problem with oEmbed, of course, is that the original source may get deleted.  Taking a copy has licensing issues, as might embedding the object.

Better control of your media library:

Video.  Display of video is tricky.  For compatibility across browsers, video needs to be stored in MP4 (H.264), WebM and Ogg Theora.  Miro Video Converter will do the hard work.

  • Disabled WP YouTube Player and Youtuber which will affect blog entries that have already used them.
  • Movies: HTML5 video (on supported browsers), Flash fallback, CSS-skinned player, hMedia microformats, attach images to videos (when used with Shuffle).  Uses the [movie] tag.
  • Degradable HTML5 audio and video which we’ve had for a while.  Uses [ video ] and [ audio ] tags.
  • BuddyPress Media Component

Well, it’s nearly that time.  Tomorrow I’ll be launching this and heading for the hills.  This attachment (Plugins target) shows the plugins I’ve settled on.  These will no doubt change over time.  It should be possible to see the difference between The Commons I and The Commons II by comparing this blog entry with the list in the PDF.

One thing to note is that the more plugins you have, the more updates the WordPress install needs.  We have to go through Change Management.  I’ll put that differently: we use Change Management to track changes to our services, and this is a good thing.  The board meets once a week.  I may need a permanent seat at the table.


New Installation of The Commons including Buddy Press Part 2

Installation of the Virtual Machines and copying of the production server

A list:

  • Find server infrastructure.  Eventually, the service will run on DMU’s converged private cloud but for now I’m making do with a DL360 G6, which will hopefully have two 4-core CPUs and 16-32GB RAM.  Disaster recovery will be covered through thorough backups of what is, at the moment, a small amount of data.  Users have the ability to upload up to 4.5GB into a blog at a time…
  • Install CentOS 6.3
    • 400GB, 50GB for the OS and the rest for VMs and storage in /usr1 including install ISOs
    • yum groupinstall "Virtualization" "Virtualization Client" "Virtualization Platform" "Virtualization Tools" "Web Server" "PHP Support"
    • Setup networking on the host to include br0
      • added this to /etc/sysctl.d/libvirtd (might be crazy)
        • net.bridge.bridge-nf-call-ip6tables = 0
        • net.bridge.bridge-nf-call-iptables = 0
        • net.bridge.bridge-nf-call-arptables = 0
      • wondering about Jumbo frames over br0 for storage???
      • Add tap0 to br0 for better networking:
      • yum install tunctl
        # appended to /etc/rc.local
        /usr/sbin/tunctl -b
        /sbin/ifconfig tap0 up
        /usr/sbin/brctl addif br0 tap0
    • dd if=/dev/zero of=/usr1/local/storage/commonstg1.img bs=1M seek=204800 count=0 #200GB sparse iSCSI storage image
    • vi /etc/tgt/targets.conf and append
      • <target iqn.2012-09.uk.ac.dmu.blue:commonstg1>
         initiator-address 146.227.XX.XX
         <direct-store /usr1/local/storage/commonstg1.img>
         lun 1
         </direct-store>
        </target>
  • Creating a 20GB (generous today) VM for MySQL
    • virt-install --connect qemu:///system -n CommonsDBVM -r 2048 --vcpus=2 --disk path=/usr1/local/vm/CommonsDBVM.img,size=20 -c /usr/local/iso/CentOS-6.3-x86_64-netinstall.iso --vnc --noautoconsole --os-type linux --os-variant rhel6 --accelerate --network=bridge:br0 --hvm
    • Set up networking (manually)
    • yum update, install rsync wget, groupinstall "MySQL Database client" "MySQL Database server"
    • Change SSHD config to allow passphrases but not passwords (having applied public keys to authorized_keys)
    • (make a decision about SELinux)
    • grab a database backup (and Commons WP backup) for testing the install
      • set MySQL’s admin password
      • change iptables to reflect MySQL comms with the Commons web server
      • mysql -u root -p < backup.sql
      • grant privileges on the database to the WordPress user, remembering the web server’s hostname (a sketch appears after this list)
  • Creating a 20GB (generous today) VM for Commons Web (storage will be iSCSI on VMs’ host for now)
    • yum update, install rsync wget, groupinstall "Web Server", php-pecl-apc
    • Change SSHD config to allow passphrases but not passwords (having applied public keys to authorized_keys)
    • (make a decision about SELinux)
    • iscsiadm --mode discovery --type sendtargets --portal 146.227.XX.XX
    • iscsiadm --mode node --targetname iqn.2012-09.uk.ac.dmu.blue:commonstg1 --portal 146.227.XX.XX --login
    • tail -f /var/log/messages (to verify and discover your new disk)
    • service iscsi restart (for a laugh)
    • fdisk -l /dev/yourNewDisk (sda for me)
      • create one partition the size of the disk and set to LVM
    • pvcreate /dev/sda1
    • vgcreate vg_commonswebdata /dev/sda1
    • lvcreate -L 199G -nusr1lv vg_commonswebdata
    • mkfs.ext4 -L /usr1 /dev/vg_commonswebdata/usr1lv
    • tune2fs -c -1 -i -1 -m 1 /dev/vg_commonswebdata/usr1lv
    • Add entry to /etc/fstab
      • LABEL=/usr1 /usr1  ext4    _netdev 0 0
    • Reboot the VM with fingers crossed the storage comes up and gets mounted.  For me, wowzers, it’s there.
  • Back on the host, list the VMs and make them start at boot:
  • # virsh list
     Id Name State
    ----------------------------------------------------
     1 CommonsDBVM running
     2 CommonsWebVM running
    [root@winsor ~]# virsh autostart CommonsDBVM
    Domain CommonsDBVM marked as autostarted
    [root@winsor ~]# virsh autostart CommonsWebVM
    Domain CommonsWebVM marked as autostarted
  • Reboot the host and check the VMs come up with the iSCSI storage mounted.  For me “Wuhoo!”
  • Back on CommonsWeb:
    • yum install php-mysql mysql php-ldap and check connection to remote database
    • Configured Apache to reflect current Commons website
    • Changed /etc/hosts to include IP/hostnames for testing
    • restored WP backup (correlating to database restore)
  • Mustn’t forget backups of the host and VMs.  We use Amanda, which is super good for Linux and web servers.  Large media files and large VM images are a problem but for now we’re living with it.  On the VMs’ host I exclude ISOs, VMs and the storage as I’m backing up the VMs’ file systems.  This avoids backing up large images because of a small change to a VM and keeps data related to the host in one place.
  • Tune PHP and MySQL
    • MySQL first:
    • [mysqld]
      datadir=/var/lib/mysql
      socket=/var/lib/mysql/mysql.sock
      user=mysql
      # Disabling symbolic-links is recommended to prevent assorted security risks
      symbolic-links=0
      query_cache_limit=1M
      query_cache_size=32M
      query_cache_type=1
      max_connections=3000
      max_user_connections=600
      interactive_timeout=100
      wait_timeout=100
      connect_timeout=10
      thread_cache_size=128
      key_buffer=256M # 64M for 1GB, 128M for 2GB, 256M for 4GB
      join_buffer_size=4M # 1M for 1GB, 2M for 2GB, 4M for 4GB
      max_allowed_packet=32M
      table_cache=1024
      sort_buffer_size=4M # 1M for 1GB, 2M for 2GB, 4M for 4GB
      read_buffer_size=4M # 1M for 1GB, 2M for 2GB, 4M for 4GB
      read_rnd_buffer_size=3072K # 768K for 1GB, 1536K for 2GB, 3072K for 4GB
      max_connect_errors=10
      thread_concurrency=4
      myisam_sort_buffer_size=128M # 32M for 1GB, 64M for 2GB, 128M for 4GB
      skip-locking
      [mysqld_safe]
      log-error=/var/log/mysqld.log
      pid-file=/var/run/mysqld/mysqld.pid
      [isamchk]
      key_buffer=256M # 64M for 1GB, 128M for 2GB, 256M for 4GB
      sort_buffer=256M # 64M for 1GB, 128M for 2GB, 256M for 4GB
      read_buffer=64M # 16M for 1GB, 32M for 2GB, 64M for 4GB
      write_buffer=64M # 16M for 1GB, 32M for 2GB, 64M for 4GB
      [myisamchk]
      key_buffer=256M # 64M for 1GB, 128M for 2GB, 256M for 4GB
      sort_buffer=256M # 64M for 1GB, 128M for 2GB, 256M for 4GB
      read_buffer=64M # 16M for 1GB, 32M for 2GB, 64M for 4GB
      write_buffer=64M # 16M for 1GB, 32M for 2GB, 64M for 4GB
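
The SSHD change above (keys with passphrases, no passwords) boils down to a couple of lines in /etc/ssh/sshd_config followed by a reload; a minimal sketch:

# relevant lines in /etc/ssh/sshd_config
PubkeyAuthentication yes
PasswordAuthentication no
ChallengeResponseAuthentication no
# then pick up the change
service sshd reload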
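
And the database grant step above looks something like the following; the database name, user, host and password are all placeholders:

mysql -u root -p -e "GRANT ALL PRIVILEGES ON commons.* TO 'commonsuser'@'commonsweb.example.ac.uk' IDENTIFIED BY 'secret'; FLUSH PRIVILEGES;"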

Hmmm…I seem to have a working copy of our website spread across two VMs.


New Installation of The Commons including Buddy Press Part 1

Introduction

The Commons has been running as a pilot since February 2011.  It has just received its mandate to be a service, one that CELT at DMU is excited about.  The hope of The Commons is to bring communities at DMU together.  Initially, we have kept it simple.  Anyone who is able to log in to services as a user at DMU has been able to create an account on The Commons and, in doing so, automatically have a blog created for them.  Every public blog entry a user writes automatically gets aggregated on the panoramic pages of The Commons.  Visitors can see instantly what the community (using the blogs) is up to by looking at the tag cloud.  That, so far, has been the extent of the effort.  Rob Watson in CTS in the Faculty of Technology has taken it a bit further.  He has encouraged students to categorise their blog entries and has aggregated them as a radio production blog.  POD are using the blogs for training.  The Executive Board are also using the blogs.

The Conversation

Now, we are going to take things a little further by introducing, this term, the functionality from Buddy Press.  Buddy Press promises the opportunity to use forums, instant (including private) messaging, self-organising and hundreds of plugins.  Lincoln has been running with Buddy Press on top of WordPress Multisite for a while and I spoke to Joss Winn about their install.  These are my (poor) notes:

  • Lincoln have 3,000 users and 1,200 blogs in 6GB
  • Use Buddy Press
  • Caching…use the caching capabilities of Apache/PHP -> APC rather than a WordPress plugin
  • Connect LDAP to WordPress (done)
  • oEmbed automatic since WordPress 2.9 but can be expanded to include more oEmbed websites
  • Unfiltered MU prevents WordPress MU/WordPress 3.0 multisite from stripping <iframe>, <embed>, etc. for Administrators’ and editors’ posts
  • Joss recommended a subscription to WPMU dev
  • Search Engine Optimisation (done)
  • Joss recommended an Akismet subscription to prevent spam.  We use GrowMap Antispam and SI Captcha Antispam
  • Subscribe to comments enables commenters to sign up for e-mail notification of subsequent entries
  • Must Use plugins.  Plugins that will be turned on for every user (e.g. antispam)
  • New Blog Defaults default settings for new blogs
  • Recipe: WordPress, WordPress MU, Buddypress, Buddypress plugins
  • Buddy Press social network lives under root blog. We discussed the web SSL certificate.  Ours doesn’t support the root domain and so we need a new one.
  • a Create Blog on every page (needs investigation)
  • Allow staff the ability to grant external users authority
  • BP More Privacy Options Activity filter plugin
  • BP Group email plugin
  • BP Mobile plugin for a better experience on mobile devices
  • BP External Group Blogs plugin linking RSS activity to email
  • Custom headers and footers plugin on every blog
  • MU plugins folder for footers (??? I can’t remember what this was ???)
  • WP LaTeX (done)
  • Hotfix plugin for patches which will be rolled in to an official update (??? #CAB)
  • The last two are sketchy: Site tags limit blogs, statistics about the use of blogs and themes

 


Create an affiliate search with alert on Web of Knowledge (Science)

A quick note because this was tricky.

Go to Web of Knowledge and click on the Web of Science tab. Click on “Advanced Search” and enter something like:

OG = “De Montfort Univ”

hit “Search”.

You will get a table that lists your search and, in blue, the number of results.  Click on “Save History / Create Alert”.  If you’re registered and logged in then you’ll be able to set up your alert there and then.  Otherwise, once you’ve registered you’ll come back to this page.

Seems easy but it took 2 librarians, an ITMS and phone-a-friend to find it.


TV Project (2008)

This PDF describes how DMU could (has) implemented TV capture and playout: TV Project

It starts:

With the uptake of an ‘upgrade’ to the TV licence the library may now capture off air programmes to disk and then stream those programmes to any computer desktop within the university’s firewall. The new system will eventually replace the current record to video/dvd and catalogue in the LMS system. The web team have been looking at the technology of capturing off air recordings and streaming them for over two years. The testing of research began after a visit to the International Broadcasters Conference IBC 2006. Many off-the-shelf solutions are close to the system that will be implemented. One expensive solution from Anevia would match our needs, for now, but in talking to their representatives on the Anevia stand and to other companies’ representatives Open Source tools feature greatly among the offerings. We know now that we can implement this project using Open Source tools combined with some appropriate hardware.
