Geek, part-time cyclist, ecoist, bad photographer, vegetarian. I fiddle with WordPress and Edu Tech.

Roasted chickpea and broccoli burritos

Makes 6-8

Shamelessly taken from Thug Kitchen because it’s so nice! If you think the spice blend might be a bit hot, feel free to leave out the cayenne or reduce the chili powder.

You will need:

  • 2 x 400g tins of cooked chickpeas
  • 1 large white onion
  • 1 red pepper
  • A large head of broccoli
  • 3 cloves of garlic
  • A large baking tray
  • 6-8 tortilla wraps
  • 1 avocado
  • Some mixed leaves
  • Anything else you like to put in burritos

For the spice blend:

  • 2 teaspoons of chili powder
  • 1 teaspoon of ground cumin
  • 1 teaspoon of paprika
  • 1 teaspoon of ground coriander
  • Half a teaspoon of cayenne pepper
  • 1-2 tablespoons of soy sauce
  • 3 tablespoons of olive oil

Stick your oven on at 220C. Whilst that’s heating up, peel the onion, deseed the pepper and chop them up along with the broccoli. Aim for chickpea-sized pieces: fairly small, not too chunky.

Place all the chopped veg in a large bowl and add the drained, cooked chickpeas. Pour in the olive oil and soy sauce, give it all a good stir, then throw in the spices and mix until everything is coated.

Pour the mixture out onto your baking tray and bake for 20 minutes.

Either chop or mince your garlic then add it to the baking tray mix, give it all a good stir and return it to the oven for 10-15 minutes.

The broccoli probably looks burnt, but that’s the plan, it’s not really burnt, it’s just full of deliciously crispy flavour.

Now to make your burritos! Halve, peel and de-stone that avocado, then slice it up. Add some leaves or whatever else you want in your burrito, maybe some hot salsa!

Roll, serve and wait for the party to happen in your mouth.

Pasta with a creamy avocado sauce

Serves 4

How hungry are you? Can you wait 12 minutes? Because that’s exactly how long it takes to make this simple and delicious meal.

You will need:

  • 400g of your favourite pasta, I used linguine
  • 2 ripe avocados
  • A few basil leaves
  • 2 cloves of garlic
  • 2 tbsp of lemon juice
  • Olive oil
  • Salt and pepper
  • A 200g tin of sweetcorn
  • A few cherry tomatoes
  • Some kind of food processor

Right, fill a large pan with boiling water, stick your pasta in it and cook it however it says on the packet.

Whilst that’s doing its thing we can make the sauce. Halve your avocados, peel them and get rid of the stones, then stick them in the food processor along with the basil leaves, peeled garlic, lemon juice and a twist of salt & pepper to taste.

Process it all together until smooth.

With the processor running, drizzle in 2 or 3 tablespoons of olive oil until the sauce emulsifies; you’ll see it change in consistency and thicken. You might not need all of the oil, so see how you get on.

Drain your sweetcorn and quarter your cherry tomatoes.

Your pasta should be done by now (see, it’s that quick). Drain it and return it to the pan, stir in your avocado sauce, sweetcorn and tomatoes, and serve!

Adapted from:

Roasted aubergine and red pepper couscous with halloumi cheese recipe

This is fairly quick to knock up and makes enough for 4 portions; it’s also easy to make vegan by replacing the halloumi cheese with sliced tofu.

Ingredients – Makes 4 portions

220g uncooked couscous

Olive oil

Freshly ground black pepper

1 vegetable stock cube (I use a Knorr stock pot)

330ml boiling water

1 large or 2 medium red onions

3 cloves of garlic, minced

1 large aubergine

2 sweet red peppers

1 250g block of halloumi or 1 block of tofu

1 can of cooked green lentils (Lentilles Vertes)

A drop of truffle oil – Optional


Right, heat your oven to 200C (I’ve got a fan oven).

De-seed your peppers and chop them into pieces; make them however big you want.

Chop the aubergine into slices and then into pieces, again, any size you want but bear in mind you want all this to roast fairly quickly.

Throw the pepper and aubergine on a baking tray, drizzle with olive oil, add a twist of black pepper and mix it all together.

Put that in the oven to roast and grab the onion(s).

Peel and chop the onions, put them in a large pan with a drizzle of olive oil and a twist of black pepper, and cook over a medium heat until they’re turning soft and almost translucent. Add the minced garlic and cook for a few more minutes, but don’t overcook it.

Once the veg has been roasting in the oven for 10 minutes take it out, give it a shake and a turn and put it back in the oven for 10 minutes.

Slice the halloumi cheese into 8 slices, add a little olive oil to a frying pan and get it on a medium heat. Add the slices of halloumi and cook until golden brown, keep an eye on them, they can brown quickly. When both sides are done remove from the pan and set aside.

Your roast veg should have had about 20 minutes now; check to see if it’s done. I like it slightly crispy and almost burnt, but each to their own. If it’s done, turn the oven off and get started on the couscous.

Dissolve the stock pot in 330ml of boiling water, add the couscous to the pan with the onions and garlic and place on a very low heat. Add the stock to the pan, stir to cover the couscous and put a lid on the pan.

After 2-3 minutes give the couscous a stir, it should have swollen up and taken on all the water, if it hasn’t, put the lid back on and give it another minute but keep checking.

Once it’s done, drain the lentils and add them to the pan along with the roasted veg and halloumi cheese.

Add the totally optional drop of truffle oil and give it all a good stir. I know it sounds really pretentious, but I think it really makes a difference. If you’ve not got truffle oil and you’re tempted to get some, make sure it’s a good-quality oil with some real truffle; I’ve got some of the Truffle Hunter oils.

Plesk root password recovery

I was doing a password rotation on a server the other day and for some reason it failed whilst I was updating root. Maybe the password was too long, maybe Virtuozzo doesn’t do proper validity checking but either way I lost access to root.

Luckily if you’ve still got access to a Plesk admin user you can use this to your advantage and get root access back.

First of all you need a user with SSH access: in the Plesk admin panel, make sure the account is set up with ‘/bin/bash’ as the shell and not ‘/bin/bash (chrooted)’.

SSH into the server with that user’s credentials and create two scripts. The first one:

#!/bin/sh
cp /etc/shadow /tmp/shadow.tmp
chmod 777 /tmp/shadow.tmp

The second:

#!/bin/sh
cp /tmp/shadow.tmp /etc/shadow
chmod 640 /etc/shadow

Place them in /tmp or wherever you want to run them from, and name them what you like; I’ll refer to them as your first and second scripts from here on.

Give the scripts execute permissions:

chmod +x /tmp/your-first-script /tmp/your-second-script

Now go back to your Plesk admin panel and go to Server > Tools & Resources > Scheduled Tasks.

Search for or select the user ‘root’ and add a new task.

Enter */1 in the Minute field and * in the rest of them. In the Command field enter the path to your first script, most probably /tmp/your-first-script.

Hit the OK button.

This cron job will run the first script once a minute, on the minute. So wait a minute and it will have made a copy of the /etc/shadow file called /tmp/shadow.tmp; check your /tmp folder for it.

Once the file has appeared, remove the task in the Plesk admin panel so it stops copying the file every minute.

In your SSH session open /tmp/shadow.tmp in your favourite editor.

Copy the whole line for a user that you know the password of, you might want to choose the line that matches the SSH user you’re currently using as you definitely know that password.

Replace the line (most probably at the top of the file) for the root user with the one you’ve just copied, then change the username at the beginning of the line to ‘root’. Save the file, making sure it’s still called ‘shadow.tmp’.
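If you’d rather script that edit than poke at the file by hand, something like the following would do it. This is only a sketch run against a throwaway demo file; ‘deploy’ and both hashes are invented placeholders, and on the server you’d point it at /tmp/shadow.tmp instead.

```shell
#!/bin/sh
# Demo of swapping root's hash for a known user's hash.
# "deploy" and both hashes below are made-up placeholders;
# on the real server you'd run this against /tmp/shadow.tmp.
F=/tmp/shadow.demo
cat > "$F" <<'EOF'
root:$6$oldsalt$oldhash:17000:0:99999:7:::
deploy:$6$knownsalt$knownhash:17000:0:99999:7:::
EOF

# Grab the hash field of the user whose password you know...
KNOWN_HASH=$(awk -F: '$1 == "deploy" {print $2}' "$F")

# ...and splice it into root's line, leaving root's other fields alone.
sed "s|^root:[^:]*:|root:${KNOWN_HASH}:|" "$F" > "$F.new" && mv "$F.new" "$F"

grep '^root:' "$F"   # prints root:$6$knownsalt$knownhash:17000:0:99999:7:::
```

This keeps root’s expiry fields intact rather than copying the whole line, which amounts to the same thing for logging in.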

Now go back to the Plesk admin panel and make a new scheduled task with exactly the same configuration as before, but set the command to the path of your second script, most probably /tmp/your-second-script.

Hit the OK button on the task and wait 1 minute for it to run; after a minute, remove the task so it doesn’t carry on running the script. If you’ve done everything right you’ll have replaced the password hash for the root user with a known one, and you’ll be able to log in as root using that known password.

Once you’ve logged back in change the root password and clear up the files in your /tmp folder.

Let me know how you get on. I know the scripts could be cleaned up and consolidated, but I didn’t want to rely on a delay that might rush me into editing the file in time; it was easier just to run two cron jobs.
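For what it’s worth, a consolidated version could be one script that takes a mode argument and gets called with ‘copy’ or ‘restore’ from the cron task. This is only a sketch of the idea, and the shadowfix name is an invented placeholder:

```shell
#!/bin/sh
# Sketch of the two scripts merged into one. The cron task would call
# it as "/tmp/shadowfix copy" or "/tmp/shadowfix restore"; the name
# and argument convention here are invented, not Plesk conventions.
shadowfix() {
  case "$1" in
    copy)
      cp /etc/shadow /tmp/shadow.tmp && chmod 777 /tmp/shadow.tmp
      ;;
    restore)
      cp /tmp/shadow.tmp /etc/shadow && chmod 640 /etc/shadow
      ;;
    *)
      echo "usage: shadowfix copy|restore" >&2
      return 1
      ;;
  esac
}
```

You still need the two separate cron runs either way; this just means one file to clean up afterwards.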

Speeding up RAID migration on a Synology DS414 NAS

I’ve had a Synology DS414 NAS for a few weeks now, this post is about how to change the default settings of mdadm, the tool used to manage software RAID, to speed up the process of migrating between RAID levels.

I started out with 2 x 4TB WD Red drives, configured as a Synology Hybrid RAID (SHR) volume, which dynamically changes the RAID level depending on the number of drives you assign to it.

With 2 disks it’ll run in RAID 1, mirroring the data held on the drives; add another disk and it’ll convert the volume to RAID 5, striping the data across the drives for more available space whilst adding parity information to cope with the failure of any one of the drives.

Adding the 3rd disk was quick and easy, the DS414 supports hot plugging devices so I just went ahead and put the new drive in, added the drive to the volume and it went ahead and expanded it.

The next part is a bit of a waiting game, and depending on the size of the volume it can take a while. Because the DS414 uses software RAID, with no dedicated RAID hardware, the work falls to the CPU of the device, which isn’t the fastest.

I left it overnight and late the next day it had only done about 30%. Whilst the volume is expanding, your data is essentially at risk, as the array is not redundant; the longer the process takes, the longer you’re not protected against disk failure.

There are a few things you can do to speed up the process, SSH to your NAS as admin and enter the following commands (change md3 to your device):

# echo 100000 > /proc/sys/dev/raid/speed_limit_min
# echo 32768 > /sys/block/md3/md/stripe_cache_size

The first command increases the minimum “goal” rebuild speed for when there’s non-rebuild activity. On my DS414 I never saw this go above 90,000 KiB/s.

The second command increases the stripe cache size, which improves sync performance by allowing a larger cache to synchronise the read and write operations on the array. This is only available for RAID 5 volumes, and it does reduce the amount of available system RAM, but I never saw 100% utilisation on the DS414.

You can monitor the process with the following command:

# cat /proc/mdstat

Once I’d changed these settings the expand operation only took another 12 hours, about 35 in total. It should also speed up volume consistency checks, as they read the same settings. Remember the commands above only set those options until the NAS is rebooted.
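If you get bored of scanning the full mdstat output, the progress figure can be pulled out with a little awk. The sample line below is made up to match the mdstat format; on the NAS you’d read /proc/mdstat instead of echoing a sample:

```shell
#!/bin/sh
# Extract the percentage from a reshape/recovery line as mdstat prints it.
# SAMPLE is an invented example line; on the NAS, replace the echo with:
#   awk '{for (i = 1; i <= NF; i++) if ($i ~ /%$/) print $i}' /proc/mdstat
SAMPLE='      [=====>...............]  reshape = 29.3% (1146880/3906887168) finish=612.5min speed=75120K/sec'
echo "$SAMPLE" | awk '{for (i = 1; i <= NF; i++) if ($i ~ /%$/) print $i}'
# prints 29.3%
```

Handy inside `watch` if you want it to tick over on screen.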

Getting ‘eContent’ onto supervised iPads

Managing a large number of iPads can be frustrating; getting content onto them, even more so. If you want to manage the applications available on the iPads, Apple Configurator is your man; if you want to manage the content on them, you need to find another way to do it.

I was asked to load a self-created iBook onto a number of iPads. Unhelpfully, the iBooks application doesn’t support iTunes File Sharing, a system that lets you copy files between your computer and apps on iOS devices. Other Apple apps, like Pages and Keynote, do support it, but not iBooks.

So I thought another way to do it would be to put the iBooks file in Dropbox, download it from Dropbox on iPad 1, open it in iBooks, backup iPad 1 (that now has the iBook installed on it) and restore it to the other iPads that need the iBook on them.

Unfortunately, for reasons not known to me, this doesn’t work. The iBook appears in the iBooks application, but without its cover art. Tapping on the icon starts something, but after a second the icon disappears and the bookshelf goes back to being empty.

I could have gone to each iPad and logged into Dropbox to get the file on each of them, but I was trying to keep the workflow short. The method I finally came up with still involves some manual interaction, but cuts it down.

First of all I uploaded the .ibooks file to a public-facing web server. When you link to files on Dropbox it doesn’t just serve up the file immediately; you need to click on a download link first. Hosting the file on a normal web server gets rid of this step.

In Apple Configurator I created a new profile and configured a new Web Clip in it; a Web Clip is just a link to a specific site or web page that creates an icon on the home screen.

Apple Configurator – Web Clip

The title can be anything, the URL is the link to the file you uploaded earlier. I left all the other settings, hit Save and applied the profile to all the devices that needed the iBook.

On each device all I had to do was tap the new icon on the home screen; it opens Safari, navigates to my .ibooks file on the web server and gives the option of opening the file in the iBooks application. That’s it, the iBook is saved to the application.

Once the iBook is on the device you’ll want to remove the profile from the devices to remove the icon from the home screen.

Whilst it’s more difficult to get iBooks content onto devices via Apple Configurator, Adobe Acrobat Reader does support iTunes File Sharing so you could drop PDF files onto the devices fairly easily but we specifically needed iBooks support.

Apple Configurator 1.4, iOS 7 and Eduroam

Today I sat down to configure a set of Apple iPads to connect to our institutional Wi-Fi network, we use eduroam based around a WPA2 setup.

For anyone who has already used Apple Configurator, you’ll know it’s pretty straightforward: enter a few network details, give it a certificate if needed, save and refresh your devices.

It didn’t go as easily as that. I’d previously set up an Apple TV to connect to the network, so I knew I could use Apple Configurator to do what I needed. I went about entering the network details: SSID, Security Type, Protocols, Trusts etc. But whenever I pushed the profiles to the devices they wouldn’t connect to the wireless network.

Apple Configurator – Getting the right settings for Eduroam

It all came down to the Security Type setting: although we use WPA2 Enterprise, it didn’t seem to like that option, and only when (3 hours later) I tried ‘Any (Enterprise)’ did it actually work.

php5-ldap upgrade troubles

All my php5-* packages are at version 5.3.2-1ubuntu4.19 apart from php5-ldap which is at 5.3.2-1ubuntu4.18.

Whenever I try to update any package on the system with apt, it tries to sort out the unmet dependencies of php5-ldap; it says it needs php5-common 5.3.2-1ubuntu4.18 when I have 5.3.2-1ubuntu4.19 installed.

I don’t want to remove php5-common 5.3.2-1ubuntu4.19 and all of the things that depend on it. Can you tell what it’s moaning about in the errors or how to easily fix it? It’s Ubuntu 10.04.4.

root@host:~# apt-get -f install php5-ldap
Reading package lists… Done
Building dependency tree
Reading state information… Done
The following packages will be upgraded:
1 upgraded, 0 newly installed, 0 to remove and 61 not upgraded.
18 not fully installed or removed.
Need to get 0B/19.9kB of archives.
After this operation, 0B of additional disk space will be used.
(Reading database … 70664 files and directories currently installed.)
Preparing to replace php5-ldap 5.3.2-1ubuntu4.18 (using .../php5-ldap_5.3.2-1ubuntu4.19_amd64.deb) ...
Unpacking replacement php5-ldap …
dpkg: error processing /var/cache/apt/archives/php5-ldap_5.3.2-1ubuntu4.19_amd64.deb (--unpack):
unable to install new version of `./usr/share/doc/php5-ldap': No such file or directory
Errors were encountered while processing:
E: Sub-process /usr/bin/dpkg returned an error code (1)

'vzpkg update -q -p 513909 php5-ldap' exec failed - dpkg: error processing /vz/template/ubuntu/10.04/x86_64/pm//archives/php5-ldap_5.3.2-1ubuntu4.19_amd64.vz.deb (--unpack):
unable to install new version of `/usr/share/doc/php5-ldap': No such file or directory
No apport report written because the error message indicates an issue on the local system
Errors were encountered while processing:
E: Sub-process /usr/bin/dpkg returned an error code (1)
Error: /usr/bin/apt-get failed, exitcode=100


The following packages have unmet dependencies:
php5-ldap: Depends: php5-common (= 5.3.2-1ubuntu4.18) but 5.3.2-1ubuntu4.19 is installed


root@lvps217-199-163-74:/var/cache/apt/archives# dpkg --debug=777 -i php5-ldap_5.3.2-1ubuntu4.19_amd64.deb
D000010: ensure_pathname_nonexisting `/var/lib/dpkg/’
(Reading database … 70664 files and directories currently installed.)
Preparing to replace php5-ldap 5.3.2-1ubuntu4.18 (using php5-ldap_5.3.2-1ubuntu4.19_amd64.deb) …
D000200: process_archive conffile `/etc/php5/conf.d/ldap.ini’ in package php5-ldap – conff ?
D000020: process_archive conffile `/etc/php5/conf.d/ldap.ini’ package=php5-ldap same hash=2dbdb6d30c646b5eb6b76d1fd71abe4f
D000200: oldconffsetflags `/etc/php5/conf.d/ldap.ini’ namenode 0x25b90d0 flags 5
D000001: process_archive oldversionstatus=installed
D000002: maintainer_script_alternative nonexistent prerm `/var/lib/dpkg/info/php5-ldap.prerm’
D000002: maintainer_script_new nonexistent preinst `/var/lib/dpkg/’
Unpacking replacement php5-ldap …
D000010: tarobject ti->Name=`.’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/.’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/.’ tmp=`/..dpkg-tmp’ new=`/..dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./usr’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/usr’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr’ tmp=`/usr.dpkg-tmp’ new=`/usr.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./usr/lib’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/usr/lib’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/lib’ tmp=`/usr/lib.dpkg-tmp’ new=`/usr/lib.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./usr/lib/php5′ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/usr/lib/php5′ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/lib/php5′ tmp=`/usr/lib/php5.dpkg-tmp’ new=`/usr/lib/php5.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./usr/lib/php5/20090626′ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/usr/lib/php5/20090626′ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/lib/php5/20090626′ tmp=`/usr/lib/php5/20090626.dpkg-tmp’ new=`/usr/lib/php5/20090626.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./usr/lib/php5/20090626/’ Mode=644 owner=0.0 Type=48(-) ti->LinkName=`’ namenode=`/usr/lib/php5/20090626/’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/lib/php5/20090626/’ tmp=`/usr/lib/php5/20090626/’ new=`/usr/lib/php5/20090626/’
D000100: tarobject already exists
D000010: ensure_pathname_nonexisting `/usr/lib/php5/20090626/’
D000010: ensure_pathname_nonexisting `/usr/lib/php5/20090626/’
D000100: tarobject NormalFile[01] open size=55416
D000100: tarobject nondirectory, `link’ backup
D000100: tarobject done and installation deferred
D000010: tarobject ti->Name=`./usr/share’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/usr/share’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/share’ tmp=`/usr/share.dpkg-tmp’ new=`/usr/share.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./usr/share/doc’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/usr/share/doc’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/share/doc’ tmp=`/usr/share/doc.dpkg-tmp’ new=`/usr/share/doc.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./etc’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/etc’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/etc’ tmp=`/etc.dpkg-tmp’ new=`/etc.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./etc/php5′ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/etc/php5′ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/etc/php5′ tmp=`/etc/php5.dpkg-tmp’ new=`/etc/php5.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./etc/php5/conf.d’ Mode=755 owner=0.0 Type=53(d) ti->LinkName=`’ namenode=`/etc/php5/conf.d’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/etc/php5/conf.d’ tmp=`/etc/php5/conf.d.dpkg-tmp’ new=`/etc/php5/conf.d.dpkg-new’
D000100: tarobject already exists
D000100: tarobject Directory exists
D000010: tarobject ti->Name=`./etc/php5/conf.d/ldap.ini’ Mode=644 owner=0.0 Type=48(-) ti->LinkName=`’ namenode=`/etc/php5/conf.d/ldap.ini’ flags=7 instead=`<none>’
D000200: conffderef in=’etc/php5/conf.d/ldap.ini’ current working=’/etc/php5/conf.d/ldap.ini’
D000020: conffderef in=’etc/php5/conf.d/ldap.ini’ result=’/etc/php5/conf.d/ldap.ini’
D000020: tarobject fnnf_new_conff deref=`etc/php5/conf.d/ldap.ini’
D000100: setupvnamevbs main=`/etc/php5/conf.d/ldap.ini’ tmp=`/etc/php5/conf.d/ldap.ini.dpkg-tmp’ new=`/etc/php5/conf.d/ldap.ini.dpkg-new’
D000100: tarobject already exists
D000010: ensure_pathname_nonexisting `/etc/php5/conf.d/ldap.ini.dpkg-new’
D000010: ensure_pathname_nonexisting `/etc/php5/conf.d/ldap.ini.dpkg-tmp’
D000100: tarobject NormalFile[01] open size=54
D000200: tarobject conffile extracted
D000010: tarobject ti->Name=`./usr/share/doc/php5-ldap’ Mode=777 owner=0.0 Type=50(l) ti->LinkName=`php5-common’ namenode=`/usr/share/doc/php5-ldap’ flags=2 instead=`<none>’
D000100: setupvnamevbs main=`/usr/share/doc/php5-ldap’ tmp=`/usr/share/doc/php5-ldap.dpkg-tmp’ new=`/usr/share/doc/php5-ldap.dpkg-new’
D000100: tarobject nonexistent
D000010: ensure_pathname_nonexisting `/usr/share/doc/php5-ldap.dpkg-new’
D000010: ensure_pathname_nonexisting `/usr/share/doc/php5-ldap.dpkg-tmp’
D000100: tarobject SymbolicLink creating
D000100: tarobject new – no backup
dpkg: error processing php5-ldap_5.3.2-1ubuntu4.19_amd64.deb (--install):
 unable to install new version of `./usr/share/doc/php5-ldap': No such file or directory
D000010: cu_installnew `/usr/share/doc/php5-ldap’ flags=2
D000100: setupvnamevbs main=`//usr/share/doc/php5-ldap’ tmp=`//usr/share/doc/php5-ldap.dpkg-tmp’ new=`//usr/share/doc/php5-ldap.dpkg-new’
D000100: cu_installnew not restoring
D000100: unlinkorrmdir `//usr/share/doc/php5-ldap.dpkg-new’ unlink OK
D000010: cu_installnew `/etc/php5/conf.d/ldap.ini’ flags=217
D000100: setupvnamevbs main=`//etc/php5/conf.d/ldap.ini’ tmp=`//etc/php5/conf.d/ldap.ini.dpkg-tmp’ new=`//etc/php5/conf.d/ldap.ini.dpkg-new’
D000100: cu_installnew not restoring
D000100: unlinkorrmdir `//etc/php5/conf.d/ldap.ini.dpkg-new’ unlink OK
D000010: cu_installnew `/usr/lib/php5/20090626/’ flags=602
D000100: setupvnamevbs main=`//usr/lib/php5/20090626/’ tmp=`//usr/lib/php5/20090626/’ new=`//usr/lib/php5/20090626/’
D000100: cu_installnew restoring atomic
D000100: unlinkorrmdir `//usr/lib/php5/20090626/’ unlink OK
D000002: maintainer_script_new nonexistent postrm `/var/lib/dpkg/’
D000002: vmaintainer_script_installed nonexistent postinst
D000010: ensure_pathname_nonexisting `/var/lib/dpkg/’
D000010: ensure_pathname_nonexisting running rm -rf
D000010: ensure_pathname_nonexisting `/var/lib/dpkg/reassemble.deb’
Errors were encountered while processing:

502 Bad Gateway error

Yesterday I was configuring WordPress and Jetpack when I got a 502 Bad Gateway error from Nginx.

It’s simple enough to fix but I’ve not used Nginx too much, here’s the problem and solution…

When you’re using Nginx as a reverse proxy to Apache, the upstream server can sometimes send headers too big to fit in the Nginx buffer. We’re using the Plesk Control Panel on our server, so I’m not sure if it’s to do with the defaults it sets for Nginx, but to fix it do the following.

Open /etc/nginx/nginx.conf in your favourite editor.

Add the following inside the http directive:

proxy_buffers 8 32k;
proxy_buffer_size 64k;

Save it, restart Nginx and Apache, ta da!

Your mileage may vary so play about with the number of buffers and their sizes.
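For context, those two lines sit inside the http block, roughly like this; the surrounding directives will differ on your server, especially under Plesk:

```nginx
http {
    # ... your existing directives ...

    # Larger buffers so big upstream (Apache) response headers fit.
    proxy_buffers 8 32k;
    proxy_buffer_size 64k;
}
```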

Show Us Your Art

A couple of my pictures are in the Show Us Your Art exhibition this weekend!


© 2017
