Don't let Outlook mess up your emails

Added on: Wednesday 24th June 2009

There is a move going on at the moment to try to get Microsoft to remove the Word rendering engine from Outlook 2010. What's more, the campaign is being mounted using Twitter.

Back in Outlook 2003 (I think it was), Microsoft suddenly took a backward step and started using Word's rendering engine to display HTML emails.

All of a sudden many emails that looked perfectly good in earlier versions were totally different and in many cases unusable (as forms didn't render at all).

Although it's been a slow journey to get all the major web browsers to support similar standards, we are finally getting closer with the release of IE8.

The same can't be said of email, though - we are still using table-based layouts and it's very much a case of the lowest common denominator when it comes to features.

If you are a web designer or IT professional, you should definitely tell Microsoft to move forward rather than backwards.

But even if you just use email to keep in touch with customers you should also care, because ultimately you want your emails to look the same whoever is reading them, and you don't want to spend hours and hours testing each one.

Head over to Fix Outlook to voice your disapproval at this move. (Note that you will need a Twitter account.)

The site background shows all the Twitter users who have 'tweeted' about it and also has an excellent example of how an email looks in Outlook 2000 and Outlook 2010.

Getting news feeds via email

Added on: Saturday 6th June 2009

Although most blogs and larger websites have RSS news feeds, it is thought that only 3% of computer users are familiar with how they work.

So an astonishing 97% of users aren't using feed readers to get up-to-date information from their favourite sites.

...and more importantly they aren't getting YOUR updates.

Feed My Inbox is a service which allows anyone to subscribe to any feed and get notifications sent to their email address.

Once you have confirmed your address, an email will be sent each time the feed is updated, but no more often than once every 24 hours.

Website owners can also add a simple form to their sites to allow their feeds to be received by email.

I've added one to this site so you can try it out!

Restricting API usage on websites

Added on: Thursday 4th June 2009

It seems any web application worth its salt nowadays must have an API that developers can use to create mashups or display content within their web pages.

There are various ways that these APIs can be called from a web page - PHP users can use the cURL library - but the most common is an AJAX call via JavaScript, as this can be used on any web page as long as the browser has JavaScript turned on.
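As a rough illustration, a server-side call using PHP's cURL library might look something like this (the endpoint address and key are made-up placeholders):

<?php
// Minimal server-side API call via cURL - the URL and key are placeholders.
$ch = curl_init('http://api.example.com/contacts?key=YOUR_API_KEY');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response rather than printing it
$response = curl_exec($ch);
curl_close($ch);
echo $response;
?>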

The problem with the JavaScript option is that the call to the API is exposed on the page for anyone who wants to see it.

This might be fine if your API is only for displaying content and doesn't allow interaction with a database, but you still might want to restrict usage so as not to overload your servers.

One of the most common ways to do this is to get users to register and give them a 'key' to include in any call. Your server then validates this key - in the case of Google Maps (version 2) the domain of the calling website is also checked to make sure the key hasn't just been copied from somewhere else.
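A bare-bones version of the key check, assuming a hypothetical list of registered keys, might look like this on the server:

<?php
// Illustrative only - in practice the keys would come from a database.
$valid_keys = array('abc123', 'def456');

$key = isset($_GET['key']) ? $_GET['key'] : '';

if (!in_array($key, $valid_keys)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Invalid API key');
}

// ...the key is recognised, so carry on and serve the request...
?>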

A search for options on securing API calls brings up this method amongst several others but I haven't yet found a good article on how to put it into practice.

I am rolling my own API at the moment for our Contact Management Software and need a way of protecting client data by ensuring that only registered websites use the API.

I initially thought of using the server variables exposed to PHP to check the originating site, but that didn't work - probably because there are several redirects before the authorisation script.
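For the record, the sort of check I had in mind was roughly this (the domain is just an example):

<?php
// Reject the call unless the referer PHP sees matches a registered site.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (parse_url($referer, PHP_URL_HOST) !== 'www.slowducks.co.uk') {
    header('HTTP/1.1 403 Forbidden');
    exit('Calls are only allowed from registered websites');
}
?>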

Then I got thinking that there must be a way to do it with mod_rewrite.

I was already using mod_rewrite to set up a clean REST-style URL for calling the API, and I wondered if there was any way I could pass the calling page's URL to the script. It turns out there is.

mod_rewrite allows the use of variables to check for certain conditions, and you can use the HTTP_REFERER variable in a condition to check whether the user has come from a certain page.
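For example, a condition like this in an .htaccess file (the URL pattern and api.php script name are just examples) will only apply the rewrite when the request was referred from the given site:

RewriteEngine On
# Only rewrite API calls whose referer is the expected domain
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?slowducks\.co\.uk/ [NC]
RewriteRule ^api/(.*)$ api.php?call=$1 [L,QSA]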

I then thought that it might be possible to pass the value of this variable into the rewritten URL and so make it available to the checking script.

In fact it's easy and it works: just include the variable within curly brackets, prefixed by a percent sign, and it will then be part of the query string passed to the page.

e.g. http://www.slowducks.co.uk?d=%{HTTP_REFERER}
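Putting that together, the rewrite rule and the receiving script might look something like the following sketch (again, the api.php name and the 'd' parameter are only examples):

RewriteEngine On
# Append the referring page to the rewritten query string as 'd'
RewriteRule ^api/(.*)$ api.php?call=$1&d=%{HTTP_REFERER} [L,QSA]

<?php
// api.php - the calling page now arrives as a normal GET parameter
$caller = isset($_GET['d']) ? parse_url($_GET['d'], PHP_URL_HOST) : '';
// ...compare $caller against the list of domains registered for this key...
?>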

It's not foolproof, as the referer can be spoofed and isn't always available, but my API will never be as popular as Google's (sadly), so I can handle the support issues.