Actually they seem to be selling a great deal more than that. I’ve been reading the recent (not so recent) news of Apple’s sales figures. See:
http://blogs.eweek.com/applewatch/content/corporate/apple_q3_2008_by_the_numbers.html
http://www.apple.com/pr/library/2008/07/21results.html
This is important for Colrosa as a company that undertakes software product development for our clients.
It seems that Apple achieved a 49% increase in desktop sales (units sold) and a 37% increase in portables from Q3 2007 to Q3 2008. These figures are staggering and could well affect the decisions software developers make about their choice of development platform and delivery platform for their software applications.
This slightly older article (May 2008):
http://blogs.eweek.com/applewatch/content/channel/macs_defy_windows-gravity.html
relates that “Apple’s retail market share is 14 percent, and two-thirds for PCs costing $1,000 [in the US] or more”, and this was prior to Apple Q3 figures announced in July.
I haven’t been able to find any figures on SME, corporate and home use market share, which of course would greatly impact this issue:
If I am considering developing a software product, do I: develop for MS Windows only, MS Windows and MacOS X or indeed MacOS X only?
However, this:
http://www.tuaw.com/2008/08/26/forrester-apple-nearly-quadruples-enterprise-share/
and other articles indicate a 4.5% penetration of the business market in general (Aug 2008), much lower than the overall market share. Not being a market analyst, I can’t say whether the business market, being a late adopter, is likely to follow the general or home-use trend. It seems to me, though (logically, but without proof), that this increase may be in the SME market, where personal preference can matter more than corporate IT strategy.
Cross-platform development greatly limits the technologies and languages that one might choose to develop with, yet commercially it is making more and more sense.
This is still an open question, and one that is swayed by the vertical market you might be developing for. It’s pretty clear that an application aimed at graphic designers would be more profitable if cross-platform. Perhaps something in the legal sector would not benefit at all. But what about an application in ubiquitous use across both the SME and corporate markets? Where would we draw the line?
What are the development options?
Please post a comment if you know differently!
Choices would depend on the simplicity or complexity of the UI design and therefore its percentage of the build. For example, if I were building a local file indexing engine (à la Spotlight for the Mac) with half a dozen user-settable preferences and an on/off button, I would not need to worry about re-writing the UI. If, however, I were building a diagramming tool, then my first concern would be the common UI tools between platforms.
The other important factor to consider is the additional cost of developing cross-platform. My experience of developing cross-platform in Java is that it is about 10-15% more expensive (depending on various factors) than developing for one platform.
My last thought on the matter is this: Mac users often have less choice in applications and are usually prepared to pay for something that works well, simply and with grace; right or wrong, this is why they went for the Mac and MacOS X in the first place. I believe that users are often as concerned about their experience with an application as they are about the raw features, and as often as not confuse the two: rating more usable applications as more effective in getting a job done.
Let’s get right to the point! I’ve set up a working CentOS 5 LAMP server on a VPS with 128MB of memory and it hasn’t gone into swap yet. It typically runs in just over 100MB of memory and performs really well. So I thought I’d share the process of setting it up in our blog.
I have been using Gradwell to host our websites and our clients’ sites (and VOIP) for about a year now and I’ve been very happy with their service. However, I wanted the flexibility and control that my own server could provide, but I couldn’t justify the cost of a new server and its hosting.
The solution of course was a virtual server for a number of reasons:
I had a look around, but decided first to see whether Gradwell could support the features I needed through their hosting packages. They came back to me suggesting their new VPS hosting package (which isn’t on their website yet) and of course I was happy to go with them.
When I looked around there were quite a few different offerings. At the time of writing this article (July 2008), companies are using either VMware or Xen, and some are using Windows virtualisation.
As I’m already with Gradwell, I decided to go with their offering (CentOS 5 on Xen), not least because a CentOS 5 VPS is exactly what I wanted.
You may be looking to buy a VPS on a monthly basis to host your websites and databases, but do you want to host and manage these yourself:
In my case I did not – I have enough to worry about. The IMAP daemon can be memory and CPU intensive and spam detection even more so. DNS should always be hosted on a machine other than your web server, in my opinion, and for off-site backup you really do need another machine – not a lot of choice there!
VPS systems are typically priced as a function of memory, share of CPU (how many virtual systems run concurrently on one real machine) and disk space. Because I’m in the development cycle of the project that this VPS is really going to be used for, and I only had a few small, low volume sites to host (and because upgrading memory is easy on a VPS), I’ve gone for their lowest memory spec machine – 128MB.
The configuration was as follows:
CPU: sitting on a dual quad-core machine under Xen
Memory: 128MB
Disk: 3GB
Standard CentOS 5 install:
Apache 2.2.3
MySQL 5.0.22
PHP 5.1.6
With a small amount of memory we needed to optimise the set-up so that the system does not page. The rest of this blog covers the set-up and optimisation of the new VPS server, service by service.
The first and obvious question to ask was what services do I not need?
I reviewed the running services and dropped the use of some by removing the S links in /etc/rc3.d:
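For reference, this is roughly how those start links are managed on CentOS 5; the services shown here are just common candidates for a headless VPS, not necessarily the ones I dropped:

# chkconfig --list
# chkconfig cups off
# chkconfig bluetooth off

chkconfig --list shows what starts in each runlevel, and chkconfig <service> off replaces the S links with K links, which amounts to the same thing as removing them and is a little tidier than doing it by hand.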
Normally I would use rsync to a remote machine. I don’t want to run rsyncd on my VPS and it is not running on the machine hosted by Gradwell that I’m backing up to, so I’ve used tar, sftp, cron and some shell scripts to implement the remote backup for the server. A little more bandwidth is used, but it is the memory footprint that I am concerned with.
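In case it helps, here is the shape of the thing; the paths, host name and schedule are illustrative placeholders rather than my actual script:

#!/bin/bash
# nightly-backup.sh - tar up the important directories and push them off-site
DATE=$(date +%Y%m%d)
ARCHIVE=/tmp/backup-$DATE.tar.gz
tar czf "$ARCHIVE" /etc /home
# batch-mode sftp with key-based authentication, so no password prompt
echo "put $ARCHIVE backups/" | sftp -b - backup@othermachine.example.com
rm -f "$ARCHIVE"

Run from root’s crontab with an entry along the lines of:

30 2 * * * /root/bin/nightly-backup.sh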
There are other implementations of sshd that use less memory; however, the ones I considered did not support sftp and I need that, so I’m sticking with the standard install.
I’m hosting this elsewhere – this saves memory!
MySQL settings for low memory are well documented on their website. Most installs come with some suggested configurations in /usr/share/doc/mysql-server-{mysql-version}/. I adapted the file my-medium.cnf found there and incorporated those settings into /etc/my.cnf.
Here are the resulting settings:
[client]
port = 3306
socket = /var/lib/mysql/mysql.sock

[mysqld]
datadir=/var/lib/mysql
port = 3306
socket = /var/lib/mysql/mysql.sock
old_passwords = 1
skip-locking
key_buffer = 16M
max_allowed_packet = 1M
table_cache = 64
sort_buffer_size = 512K
net_buffer_length = 8K
read_buffer_size = 256K
read_rnd_buffer_size = 512K
myisam_sort_buffer_size = 8M

[mysqldump]
quick
max_allowed_packet = 16M

[mysql]
no-auto-rehash

[isamchk]
key_buffer = 20M
sort_buffer_size = 20M
read_buffer = 2M
write_buffer = 2M

[myisamchk]
key_buffer = 20M
sort_buffer_size = 20M
read_buffer = 2M
write_buffer = 2M

[mysqlhotcopy]
interactive-timeout
Running yum-updatesd all day costs memory, so I’ve set up a cron job that switches it on only for a short period during the night. During that window another cron script emails me if updates are needed on the system, using the command `yum check-update`.
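The mechanics are simple enough; something like this (the times and script location are illustrative rather than my exact set-up):

# /etc/crontab entries to run the daemon only in the small hours
0 3 * * * root service yum-updatesd start
30 3 * * * root service yum-updatesd stop
15 3 * * * root /root/bin/check-updates.sh

and check-updates.sh is little more than:

#!/bin/bash
# mail root if any packages have pending updates
UPDATES=$(yum -q check-update)
if [ -n "$UPDATES" ]; then
    echo "$UPDATES" | mail -s "Updates available on $(hostname)" root
fi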
I don’t use Perl, so I have commented out the line:
LoadModule perl_module modules/mod_perl.so
from /etc/httpd/conf.d/perl.conf to save memory
Settings for low memory are changed to:
<IfModule prefork.c>
StartServers 1
MinSpareServers 1
MaxSpareServers 4
ServerLimit 64
MaxClients 64
MaxRequestsPerChild 5000
</IfModule>
<IfModule worker.c>
StartServers 1
MaxClients 15
MinSpareThreads 3
MaxSpareThreads 7
ThreadsPerChild 3
MaxRequestsPerChild 200
</IfModule>
I have actually increased the minimum memory from 16M to 40M, as many modern PHP applications (SugarCRM, Joomla, WordPress, etc.) need this as a minimum – better that it runs and swaps if it needs to than not run at all.
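If, as I’m assuming here, the value in question is PHP’s memory_limit, the change lives in /etc/php.ini and is easy to check:

# grep memory_limit /etc/php.ini
memory_limit = 40M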
In addition to the standard install I have installed:
These are a few notes on things I did to manage a stress-free port of applications from the Gradwell hosted server. Some of these points may in principle be useful to you, so I’ve left them in this blog.
MySQL-based and static websites were dumped and tarred up along with the webroot and logs directories, sftp’d to the new server and set up. In most cases it was necessary to edit config files and .htaccess files (which I brought into httpd.conf as I was moving from Apache 1.3 to Apache 2.2).
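For each site the move boiled down to something like the following; the site, database and user names are placeholders:

# On the old server: dump the database and tar up the web root and logs
mysqldump -u dbuser -p exampledb > exampledb.sql
tar czf example.com.tar.gz webs/example.com/htdocs webs/example.com/logs exampledb.sql
# Copy the archive to the new server
echo "put example.com.tar.gz" | sftp -b - user@newserver.example.com
# On the new server: unpack, create the database and load the dump
tar xzf example.com.tar.gz -C /home/user/
mysqladmin -u root -p create exampledb
mysql -u root -p exampledb < exampledb.sql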
I’ve mirrored the way Gradwell hosts websites in their directory structure for compatibility, i.e.:
/home/[user]/webs/[domain]/htdocs
/home/[user]/webs/[domain]/logs
I’ve added this to /etc/skel so that new users get this structure.
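In practice that amounts to little more than putting the top-level webs directory into the skeleton (the per-domain directories are created as each site is set up), roughly:

# mkdir -p /etc/skel/webs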
I’ve copied files over from the Gradwell server and they are owned by my user (UID) with GID 1000, as this is the way Gradwell do things.
All users have the group user (GID=1000). The group already existed on my VPS with GID 100, so I changed it to 1000, and I have edited /etc/default/useradd to set GROUP=1000. Ported users need to have their UID set; new users don’t need their UID set manually, and the default GID is now 1000, so:
# useradd -n --password <password> <newuser>
is used to add a new user.
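For completeness, the group change and the useradd default amount to something like this (sketched rather than copied from the server):

# groupmod -g 1000 user
# grep GROUP /etc/default/useradd
GROUP=1000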
I have set up an external HTTP monitoring service, a cron script to monitor server load and memory consumption (with the ability to restart services if needed) and something to monitor disk space.
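The load and memory check is nothing sophisticated; the sketch below gives the flavour, with the thresholds, the service restarted and the alert recipient all being illustrative:

#!/bin/bash
# watchdog.sh - run every few minutes from cron
LOAD=$(awk '{print int($1)}' /proc/loadavg)
FREEKB=$(awk '/^MemFree/ {print $2}' /proc/meminfo)
if [ "$LOAD" -gt 4 ] || [ "$FREEKB" -lt 4096 ]; then
    echo "load=$LOAD, free=${FREEKB}kB on $(hostname)" | mail -s "VPS under pressure" root
    service httpd restart
fi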
I’d be interested to hear what others have done, so please feel free to leave your thoughts and comments.
Article by Malcolm
This is a very short post about software development that I hope will be useful to anyone who finds it. I wanted to put on our blog five key checkpoints that help identify risks in a software development project, based on the more human aspects. Some of these items seem obvious, others less so. As my background is software development and software development consultancy, this list pertains to that discipline; however, many of the ideas cross over into project management in general.
There are many factors that can affect a project adversely – these are my top 5. What are yours? Please comment on this article below …
Malcolm
About six weeks or so ago I posted this question on the LinkedIn website:
I’m looking into setting up an online shop and although I’m well versed in the technologies etc. I’ve less experience (and not for some time) of the banking and payment service providers. Does one still need a payment service provider and a merchant account, or can you trade online and take major credit cards just through a single provider and a standard bank account? What sort of fees should I be looking at paying?
Followed by these two clarifications:
I had 23 responses giving a variety of advice. A lot of support for PayPal, some against. Some suggested WorldPay and others Protx, etc. In fact there was no real conclusion. After looking at this carefully myself, I have come to the conclusion that PayPal is the easy option (and therefore, not surprisingly, not the most cost effective). A serious on-line business needs to take its overheads seriously. A couple of people pointed out Google – thank you for that.
The criteria by which I chose, in the end, two suppliers are: cost, quality and brand (in that order). The two providers that I concluded offer the best deals are:
Between them, these two options represent a reliable, cost-effective choice for an on-line business taking a larger number of smaller transactions.
Malcolm
We have always taken the Green side of life very seriously. At first glance one might not think that there is a great deal an IT consultancy such as ours can do to help the environment, or at least to reduce the impact of our actions on it. But as time has passed I keep bumping into useful things – it seems to be in the zeitgeist more and more. The video below is taken from the Reuters website and talks about what companies are doing and, in particular, what products are selling at CeBIT 2008.
Malcolm
For those of you in the know, if you look carefully you may have realised that we use WordPress to build and manage our website. Website content management is something that is very close to my heart, as I’ve designed and built a corporate content management system (CMS) and helped to build many websites that use its functionality. That particular sledgehammer would not be suitable for managing the little nut that is our website, so WordPress turned out to be a real asset in building the site.
Search Engine Optimisation is often misunderstood – placed on a pedestal and either feared or revered by those who need it.
I’ve been working on optimising the Colosa website. I’ve only just started this process and our website has only just launched, so it will be interesting to see how things change over the next few months. What does it mean to optimise a website? Many people will tell you that this is a black art. The truth is rather more straightforward.
I’ve been using OpenOffice under Windows for years now, and StarOffice before that, even on Solaris – it might be more than 10 years. Recently my laptop died and I decided to get a new one, and although I haven’t had an Apple Mac for about 8 years I decided to buy a Mac: now that they run on Intel chips I can run Windows on it, plus they look and feel great and I like the fact that there is a Unix base under OSX. I thought that I would move to using NeoOffice (an OpenOffice derivative for the Mac with a native look and feel) and soak up the Mac ethos by using all of their shortcuts etc. All is great, but NeoOffice is very slow to start: in excess of 10 seconds after I start the machine and log in.
I’ve always found it (swearing in code comments) annoying and very unprofessional, and I’ve brought a couple of developers that I’ve employed over the years up on this on more than one occasion. It doesn’t help, and it encourages an “I’m annoyed” attitude rather than an “I’m challenged” attitude – and I don’t mean verbally challenged, either!
But, we are all human and swear from time to time and so when I came across this I had to put a link in our blog to it:
http://www.vidarholen.net/contents/wordcount/
I won’t tell you what it is – just have a look!
It’s interesting how VOIP is such a hot topic and yet how immature it is in terms of the slickness of its configuration. Many years ago, setting up broadband connectivity at home with a fixed IP and an associated domain or domains, a firewall, NAT, DHCP, ADSL, PPP, etc. was a task for the dedicated systems administrator. Now you can buy a box with all of that built in for only a few pounds; indeed, most broadband suppliers give them away to their new customers. And configuration is easy: user-name, password and your provider’s domain – all of the other settings are either handed off or have sensible defaults pre-set in the device. VOIP is not like this at all – yet. In a couple of years we’ll all be buying our ADSL wireless routers with our phone, and configuration for VOIP will be as simple as ADSL is now.