

An Apple a day?

Actually, they seem to be selling a great deal more than that. I’ve been reading the recent (well, not so recent now) news of Apple’s sales figures. See:

This is important for Colrosa as a company that undertakes software product development for our clients.

It seems that Apple has posted a 49% increase in desktop sales (units sold) and a 37% increase in portables from Q3 2007 to Q3 2008. These figures are staggering and have a potential impact on the sorts of decisions software developers make on their choice of development platform and delivery platform for their software applications.

This slightly older article (May 2008):

relates that “Apple’s retail market share is 14 percent, and two-thirds for PCs costing $1,000 [in the US] or more”, and this was prior to Apple’s Q3 figures announced in July.

I haven’t been able to find any figures on SME, corporate and home use market share, which of course would greatly impact this issue:

If I am considering developing a software product, do I develop for MS Windows only, for MS Windows and Mac OS X, or indeed for Mac OS X only?

However, this:

and other articles indicate a 4.5% penetration into the business market (Aug 2008) in general, much lower than the overall market share. Not being a market analyst, I can’t say whether the business market, being late adopters, is likely to follow the general or home-use trend. It seems to me, though (logically, but without proof), that this increase may be in the SME market, where personal preference can be more important than corporate IT strategy.

Cross-platform development greatly limits the technologies and languages that one might choose to develop with, yet commercially it is making more and more sense.

This is still an open question and one that is swayed by the vertical market that you might be developing for. It’s pretty clear that an application aimed at graphic designers would be more profitable if cross-platform. Perhaps something in the legal sector would not benefit at all. But what about an application of ubiquitous use across both the SME and corporate markets? Where would we draw the line?

What are the development options?

Please post a comment if you know differently!

  1. C++ (option 1) with a common interface library, e.g. Qt
  2. C++ (option 2) with separate interface code per platform
  3. Java – compile once and debug everywhere.
  4. scripting and GTK (I haven’t really looked into this, but I understand it would be possible)

Choices would depend on the simplicity or complexity of the UI design and therefore its percentage of the build. For example, if I were building a local file-indexing engine (à la Spotlight on the Mac) with half a dozen user-settable preferences and an on/off button, I would not need to worry about re-writing the UI. If, however, I were building a diagramming tool, then my first concern would be the common UI tools between platforms.

The other important factor to consider is the additional cost of developing cross-platform. My experience of developing cross-platform in Java is that it is about 10–15% more expensive (depending on various factors) than developing for one platform.

My last thought on the matter is this: Mac users often have less choice in applications and are usually prepared to pay for something that works well, simply and with grace; right or wrong, this is why they went for the Mac and Mac OS X in the first place. I believe that users are often as concerned about their experience of an application as they are about its raw features, and as often as not they confuse the two, rating more usable applications as more effective at getting the job done.

Setup of a Linux Web Server in 128Mb of Memory

CentOS 5 as a LAMP Server in a Low-Memory Situation

The results

Let’s get right to the point! I’ve set up a working CentOS 5 LAMP server on a VPS with 128MB of memory and it hasn’t gone into swap yet. It typically runs in just over 100MB of memory and performs really well. So I thought I’d share the process of setting it up in our blog.

I have been using Gradwell to host our websites and our clients’ sites (and VOIP) for about a year now and I’ve been very happy with their service. However, I wanted the flexibility of control that my own server could provide, but I couldn’t justify the cost of a new server and its hosting.

Finding a VPS Hosting Company

The solution of course was a virtual server for a number of reasons:

  1. cost
  2. manageability – no hardware to worry about (SEP)
  3. the green factor. VMs are more energy efficient and this fits with our ethos as a company.
  4. portability

I had a look around, but decided first to see if Gradwell could support the features I needed through their hosting packages. They came back to me suggesting their new VPS hosting package (which isn’t on their website yet) and of course I was happy to go with them.

When I looked around there were quite a few different offerings. At the time of writing this article (July 2008) companies are using either VMware or Xen, and some were using Windows virtualization.

As I’m already with Gradwell, I decided to go with their offering (CentOS 5 on Xen); not least because a CentOS 5 VPS was exactly what I wanted.

Other Factors to Consider

You may be looking to buy a VPS on a monthly basis to host your websites and databases, but do you want to host and manage these yourself:

  • email (pop or IMAP boxes)
  • email forwarding
  • DNS
  • backup – where are you going to host your off site backup?

In my case I did not – I have enough to worry about. The IMAP daemon can be memory and CPU intensive, and spam detection even more so. DNS should, in my opinion, always be hosted on a machine other than your web server, and for off-site backup you really do need another machine – not a lot of choice there!


VPS systems are typically priced as a factor of memory, share of CPU (how many concurrent systems on a real machine) and disk space. Because I’m in the development cycle of the project that this VPS is really for, I only had a few small, low-volume sites to host; and because upgrading memory is easy on a VPS, I’ve gone for their lowest memory spec machine – 128MB.

The configuration was as follows:

CPU: sitting on a dual quad-core machine under Xen
Memory: 128MB
Disk: 3GB of disk space
Standard CentOS 5 install:
Apache 2.2.3
MySQL 5.0.22
PHP 5.1.6

With so small an amount of memory we needed to optimise the set-up so that the system does not page. The rest of this blog covers the set-up and optimisation of the new VPS server, service by service.

Services

The first and obvious question to ask was: what services do I not need?

I reviewed the running services and dropped the use of some by removing the S links in /etc/rc3.d:

  • I removed the link S26hidd -> ../init.d/hidd in /etc/rc.d/rc3.d to stop hidd (the Bluetooth daemon) from starting.
  • I removed the link S97yum-updatesd -> ../init.d/yum-updatesd in /etc/rc.d/rc3.d to stop the yum notification daemon from running.
  • I removed the link S90xfs -> ../init.d/xfs in /etc/rc.d/rc3.d to stop the X font server, which is not needed.
  • I removed the link S18rpcidmapd -> ../init.d/rpcidmapd in /etc/rc.d/rc3.d to stop the NFS services, as I don’t need NFS.
  • I removed the link S95atd -> ../init.d/atd in /etc/rc.d/rc3.d, as I’m not using the “at” command.
  • I stopped smartd but haven’t removed its link in rc3.d yet, as I’m not sure whether it is needed under Xen – this needs more research. However, the server seems to be fine for the moment.
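For reference, the same result can be had with chkconfig rather than deleting the S-links by hand. A minimal sketch, with the service names as on a stock CentOS 5 install:

```shell
# Turn the unneeded services off at boot and stop any running instances.
# "|| true" simply skips services that aren't installed on a given box.
for svc in hidd yum-updatesd xfs rpcidmapd atd; do
    /sbin/chkconfig "$svc" off 2>/dev/null || true
    /sbin/service "$svc" stop 2>/dev/null || true
done
```

chkconfig manages the S/K links across all runlevels for you, which is easier to audit later than remembering which links were deleted.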

Backup

Normally I would use rsync to a remote machine. I don’t want to run rsyncd on my VPS, and it is not running on the machine hosted by Gradwell that I’m backing up to, so I’ve used tar, sftp, cron and some shell scripts to implement the remote backup for the server. A little more bandwidth is used, but it is the memory footprint that I am concerned with.
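A minimal sketch of such a backup script; the source directory, remote host and user are placeholders for illustration, not details from this set-up:

```shell
#!/bin/sh
# Nightly backup: tar up the content, then push the archive to a remote
# host over sftp. Key-based ssh auth must already be in place.
SRC="${SRC:-/var/www}"                 # placeholder source directory
STAMP=$(date +%Y%m%d)
ARCHIVE="/tmp/backup-$STAMP.tar.gz"

tar -czf "$ARCHIVE" "$SRC" 2>/dev/null || true

# Skip the transfer if no remote host is configured (placeholder variables).
if [ -n "$BACKUP_HOST" ]; then
    printf 'put %s backups/\n' "$ARCHIVE" | sftp -b - "$BACKUP_USER@$BACKUP_HOST"
fi

rm -f "$ARCHIVE"
```

Run it from cron once a night and rotate old archives on the remote side.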

SSH (sshd)

There are other implementations of sshd that use less memory; however, the ones I considered did not support sftp, and I need that, so I’m sticking with the standard install.

DNS (bind)

I’m hosting this elsewhere – this saves memory!

MySQL

Low-memory settings for MySQL are well documented on their website. Most installs come with some suggested configurations in /usr/share/doc/mysql-server-{mysql-version}/. I adapted the file my-medium.cnf found there and incorporated those settings into /etc/my.cnf.

Here are the resultant settings:

[client]
port            = 3306
socket          = /var/lib/mysql/mysql.sock

[mysqld]
port = 3306
socket = /var/lib/mysql/mysql.sock
old_passwords = 1
key_buffer = 16M
max_allowed_packet = 1M
table_cache = 64
sort_buffer_size = 512K
net_buffer_length = 8K
read_buffer_size = 256K
read_rnd_buffer_size = 512K
myisam_sort_buffer_size = 8M

[mysqldump]
max_allowed_packet = 16M

[isamchk]
key_buffer = 20M
sort_buffer_size = 20M
read_buffer = 2M
write_buffer = 2M

[myisamchk]
key_buffer = 20M
sort_buffer_size = 20M
read_buffer = 2M
write_buffer = 2M

Updates and Yum

I’ve set up a cron job that switches yum-updatesd on for a short period during the night, to save memory. During that window a cron script emails me if updates are needed on the system, using the command `yum check-update`.
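The cron entries look something like this; the times and the address are illustrative, not the actual values I use:

```
# /etc/cron.d/nightly-updates (sketch)
0 3 * * *  root  /sbin/service yum-updatesd start
15 3 * * * root  yum check-update | mail -s "updates pending" admin@example.com
45 3 * * * root  /sbin/service yum-updatesd stop
```

A refinement would be to mail only when `yum check-update` actually lists packages.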

Apache (httpd)

I don’t use Perl, so I’ve commented out the line:

LoadModule perl_module modules/

in /etc/httpd/conf.d/perl.conf to save memory.

Settings for low memory are changed to:

<IfModule prefork.c>
StartServers 1
MinSpareServers 1
MaxSpareServers 4
ServerLimit 64
MaxClients 64
MaxRequestsPerChild 5000
</IfModule>

<IfModule worker.c>
StartServers 1
MaxClients 15
MinSpareThreads 3
MaxSpareThreads 7
ThreadsPerChild 3
MaxRequestsPerChild 200
</IfModule>

PHP

I have actually increased PHP’s minimum memory from 16M to 40M, as many modern PHP applications (SugarCRM, Joomla, WordPress, etc.) need this as a minimum – better that it runs and swaps if it needs to than not run at all.
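Assuming it is the usual php.ini directive being raised here, the change is one line in /etc/php.ini:

```
; /etc/php.ini - per-request memory ceiling, up from the 16M default
memory_limit = 40M
```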

Additional Packages

In addition to the standard install I have installed:

  • emacs (needed for me – I can’t help but love it after all of these years)
  • php-mysql (needed for everything; I’m not sure why it wasn’t installed already)
  • php-mbstring (needed for SugarCRM)
  • php-imap (needed for SugarCRM and other webmail packages)
  • php-gd (needed for image manipulation in various PHP apps)
  • vsftpd (Joomla wants to FTP to the Unix platform because of file permissions)

Porting the Websites

These are a few notes on things I did to manage a stress-free port of applications from the Gradwell hosted server. Some of these points may in principle be useful to you, so I’ve left them in this blog.

MySQL-based and static websites were dumped and tarred up along with the web root and logs directory, sftp’d to the new server and set up. In most cases it was necessary to edit config files and .htaccess files (which I brought into httpd.conf as I was moving from Apache 1.3 to Apache 2.2).

Directory Structure

I’ve mirrored the way Gradwell hosts websites in their directory structure for compatibility, i.e.:


I’ve added this to /etc/skel so that new users get this structure.

Users and Groups

I’ve copied files over from the Gradwell server, and they keep my UID with GID 1000, as this is the way Gradwell does things.

All users have the group user (GID 1000). The group already existed on my VPS with GID 100, so I changed it to 1000, and I have edited /etc/default/useradd to set GROUP=1000. Ported users need their UID set manually; new users don’t, and the default GID is now 1000, so:

# useradd -n --password <password> <newuser>

is used to add a new user.
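The one-off group changes described above can be sketched as follows; the group name “users” is an assumption for illustration, as only the GIDs are given above:

```shell
# Move the existing group from GID 100 to 1000 (group name assumed)...
/usr/sbin/groupmod -g 1000 users 2>/dev/null || true
# ...and make 1000 the default group for accounts created with useradd.
sed -i 's/^GROUP=.*/GROUP=1000/' /etc/default/useradd 2>/dev/null || true
```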

Checks and Monitors

I have set up an external HTTP monitoring service, a cron script to monitor server load and memory consumption (with the ability to restart services if needed) and something to monitor disk space.
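A sketch of the memory part of such a cron check; the threshold and the choice of service to restart are illustrative assumptions, not the actual script:

```shell
#!/bin/sh
# Restart Apache if free memory drops below ~8MB, before the box starts
# swapping. Threshold and restart target are illustrative.
FREE_KB=$(awk '/^MemFree:/ {print $2}' /proc/meminfo)
if [ "${FREE_KB:-999999}" -lt 8192 ]; then
    /sbin/service httpd restart 2>/dev/null || true
fi
```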


I’d be interested to hear what others have done, so please feel free to leave your thoughts and comments.

Article by Malcolm

Software Development Project Risks – The Human Side

This is a very short post about software development that I hope will be useful to anyone who finds it. I wanted to put on our blog five key checkpoints that help identify risks in a software development project, based on the more human aspects. Some of these items seem obvious, others less so. As my background is software development and software development consultancy, this list pertains to that discipline; however, many of the ideas cross over into project management in general.

  1. Is the project ‘project managed’? If there is no dedicated project manager with project management skills, then risks and issues are not likely to be picked up and dealt with.
  2. Is the software development project technically managed? This is quite different from good project management. Good technical management reduces risks such as: bad choice of technologies; ego programming; poor communication between developers; poor testing; lack of robustness; no extensibility; and poor documentation.
  3. Is the software development team qualified? It’s not just important to have managers with the right experience; the team needs the right experience too. Sometimes skill areas are transferable and training can help. In general, a software development project team lacking some experience, where that lack has been identified up-front and managed through training and time given to self-learning, may be an acceptable risk. After all, we all go through the process of learning new skills from time to time. However, asking a team of Cobol programmers to tackle a web integration project on the basis that “they are programmers” and “that is all that is needed” would most certainly be a recipe for disaster. Of course most software development projects are not so extreme. The risk is that your software development project might be closer to this example than you might think; without looking at it carefully you might never know until it is too late.
  4. Staff attrition: is anyone likely to leave the team before the end of the software development project? This does happen, and it can cause a real issue if work is so segregated that team members have no real idea of what the others are doing and therefore cannot pick up someone else’s area of work.
  5. Is communication good between the client (or project owner) and the software development project team, and its managers in particular? Is progress well communicated to the client? Does the team understand their client’s expectations, and have the project’s limitations been well communicated to the client? Does the client understand the risks and issues that have come up during development to date, and do they have a good grasp of the project plan, schedule and current progress? In the end, project success is about meeting your client’s expectations, and it’s key to at least manage those expectations!

There are many factors that can affect a project adversely – these are my top five. What are yours? Please comment on this article below…

What is the most cost effective way to process card payments online?

About six weeks or so ago I posted this question on the LinkedIn website:

I’m looking into setting up an online shop, and although I’m well versed in the technologies etc., I’ve less experience (and not for some time) of the banking and payment service providers. Does one still need a payment service provider and a merchant account, or can you trade online and take major credit cards through a single provider and a standard bank account? What sort of fees should I be looking at paying?

Followed by these two clarifications:

  • Just to clarify – I expect transactions after the first year to be about 1,500 per month with an average value of £20 inc. VAT.
  • The market is to be UK-only at first, and then the EU after that.

The Response

I had 23 responses giving a variety of advice: a lot of support for PayPal, some against; some suggested WorldPay and others Protx, etc. In fact there was no real conclusion. After looking at this carefully myself, I have come to the conclusion that PayPal is the easy option (and therefore, not surprisingly, not the most cost-effective). A serious online business needs to take its overheads seriously. A couple of people pointed out Google – thank you for that.

My Conclusions

The criteria by which I chose, in the end, two suppliers are: cost, quality and brand (in that order). The two providers that I concluded offer the best deals are:

  • Google Checkout at 1.9% of the transaction value.
  • HSBC Secure ePayments at 2.0% of the transaction value (£200 set-up, £20 monthly). Rates negotiable over £50K per annum.

Together these two represent reliable, cost-effective options for an online business taking a large number of small transactions.
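At the volumes from the original question (1,500 transactions a month at £20 average, i.e. £30,000 a month) the two come out close. A quick check, leaving HSBC’s one-off £200 set-up fee aside:

```shell
awk 'BEGIN {
    turnover = 1500 * 20                                     # GBP per month
    printf "Google Checkout: %.2f\n", turnover * 0.019       # 1.9% per transaction
    printf "HSBC ePayments:  %.2f\n", turnover * 0.020 + 20  # 2.0% plus GBP 20/month
}'
```

So roughly £570 against £620 a month at this volume, with the gap narrowing once HSBC’s negotiable rates kick in above £50K per annum.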

Green Action in IT

We have always taken the Green side of life very seriously. At first glance one might not think that there is a great deal that can be achieved by an IT consultancy such as ours to help the environment, or at least to reduce the impact of our actions on the environment. But as time has passed I keep bumping into useful things – it seems to be more and more in the zeitgeist. The video below is taken from the Reuters website and talks about what companies are doing, and in particular what products are selling, at CeBIT 2008.


Using WordPress

For those of you in the know, if you look carefully you may have realised that we use WordPress to build and manage our website. Website content management is something that is very close to my heart, as I’ve designed and built a corporate content management system (CMS) and helped to build many websites that use its functionality. That particular sledgehammer would not be suitable for managing the little nut that is our website, so WordPress turned out to be a real asset in building the site.

Continue reading

SEO – Search Engine Optimisation

Search Engine Optimisation is often misunderstood – placed on a pedestal and either feared or revered by those who need it.

I’ve been working on optimising the Colosa website. I’ve only just started this process and our website has only just launched, so it will be interesting to see how things change over the next few months. What does it mean to optimise a website? Many people will tell you that this is a black art. The truth is rather more straightforward.

Continue reading

NeoOffice Quickstarter

I’ve been using OpenOffice under Windows for years now, and StarOffice before that, even on Solaris – it might be more than 10 years. Recently my laptop died and I decided to get a new one. Although I haven’t had an Apple Mac for about 8 years, I decided to buy a Mac: now that they run on Intel chips I can run Windows on it, plus they look and feel great, and I like the fact that there is a Unix base under OS X. I thought I would move to using NeoOffice (an OpenOffice derivative for the Mac with a native look and feel) and soak up the Mac ethos by using all of their shortcuts, etc. All is great, but NeoOffice is very slow to start: in excess of 10 seconds after I start the machine and log in.

Continue reading

Swearing in development code

I’ve always found it (swearing in code comments) annoying and very unprofessional, and I’ve pulled up a couple of developers that I’ve employed over the years on this on more than one occasion. It doesn’t help, and it encourages an “I’m annoyed” attitude rather than an “I’m challenged” attitude – and I don’t mean verbally challenged, either!

But, we are all human and swear from time to time and so when I came across this I had to put a link in our blog to it:

I won’t tell you what it is – just have a look!

VOIP is still immature

It’s interesting how VOIP is such a hot topic and yet how immature it is in terms of the slickness of its configuration. Many years ago, setting up broadband connectivity at home with a fixed IP and an associated domain or domains, a firewall, NAT, DHCP, ADSL, PPP, etc. was a task for the dedicated systems administrator. Now you can buy a box with all of that built in for only a few pounds; indeed, most broadband suppliers give them away to their new customers. And configuration is easy: user name, password and your provider’s domain – all of the other settings are either handled for you or have sensible defaults pre-set in the device. VOIP is not like this at all – yet. In a couple of years we’ll all be buying our ADSL wireless routers with our phones, and configuring VOIP will be as simple as ADSL is now.

Continue reading