To further focus attention on local Swedish APIs, we are now starting a competition to find the best Swedish API. After about two weeks of voting we hope to present the best API the country has to offer!
We have noticed a real need for this over the last few years while working with our customers and their APIs. There has been a real lack of a one-stop shop where developers can find Swedish APIs, and it has been hard for Swedish API providers to reach developers. Now we hope to have solved that problem…
For the last few CodeIgniter projects (begagnadebarnkläder.nu – hitta billiga barnkläder, for example) I have used two libraries that have made my life as a developer much easier, and I really recommend that anyone using CodeIgniter take a serious look at them. Those libraries are IgnitedRecord and BackendPro. They are not brand new, but they sure get the work done!
IgnitedRecord
IgnitedRecord is an ORM library that makes interacting with the database much easier, especially when it comes to handling relationships between tables. An example is getting all posts a user has written in a blog:
$posts = $user->related('posts')->order_by('name', 'desc')->get();
Nice and clean, and very easy to set up: all you need to do is have your models extend IgnitedRecord and then define the relationships in the model. Some other goodies are that it is very easy to write subqueries and nested WHERE statements.
Instead of going into all the details here I think you should go and download IgnitedRecord straight away and get started, it will save you time.
BackendPro
Almost every project needs some kind of administration backend. With some frameworks (Django) you get this out of the box, but unfortunately not with CodeIgniter. BackendPro gives you that and so much more – it also helps out with things like user authentication and access rights, asset management, breadcrumbs, preference handling etc. The fact alone that it is easy to set up and then handles basic user authentication (such as login, registration and forgotten passwords) is well worth the price of admission (which is nothing, since it is all free). BackendPro also comes with the excellent Matchbox library, which lets you organize your code into modules.
Combining the two
Unfortunately the two do not fit perfectly together, since BackendPro uses CodeIgniter's standard database libraries. So if you need to do any database work within BackendPro itself you need to remember not to use IgnitedRecord. Getting the two libraries to play nicely once they are installed is quite straightforward though…
Do you have any other favorite libraries for CodeIgniter? Please let me and everybody else know in the comments!
WordPress MU – a bit too limiting
One solution to this is to use WordPress MU which allows you to run many blogs in the same installation. The problem with WordPress MU is that it is quite a hassle to install, and it requires the kind of server access that you won’t have if you run on a standard shared hosting account. Another problem with WordPress MU is that it is usually a few versions behind the regular WordPress, so the latest and greatest (and most secure) features and plugins won’t always work. The WordPress people have stated that moving forward WordPress MU will be one of their areas of focus, so I expect that all the negative stuff I just mentioned will be gone in a year or two, but I want a solution now!
WP-Hive to the rescue
Lately I have started to use the WordPress plugin WP-Hive to solve my multiple-WordPress-blog problem, and so far it has worked very nicely. Install WordPress and the plugin according to the plugin's documentation; it requires some copying of files but otherwise it is pretty much like the standard WordPress install. After you have one blog set up with WP-Hive you can add more blogs to the same WordPress installation, each blog with its own domain name and its own unique settings. The great thing is that all the blogs you install will share the same basic WordPress installation, the same plugins and the same themes. So you only need to update WordPress or a plugin once and it is updated for all blogs – a real time saver. Since WP-Hive is a WordPress plugin you can also use the latest and greatest WordPress version and all the other plugins that you want.
At the moment I am running some Swedish wedding sites this way, for example Bröllopsinbjudningar, Bröllopsmeny and Bröllopsbukett. They all use the same plugins and the same theme, but all have different settings and different content. With some help from WP Super Cache all my blogs are running nicely from a shared hosting account. The only negative thing I have noticed is that if you have several domains starting with the same letters (“br” in my case) you need to do some manual setup in the database to get things to work.
Do you use something other than WordPress MU or WP-Hive? Tell us all about it!
A common way to track clicks on outgoing links is to point each link at an internal tracker script that logs the click and then redirects, like this:
<a href='http://www.digitalistic.com/tracker.php?id=123'>WebHostNinja</a>
The problems with this are that it is slow (one more page load) and that the link the user sees does not lead where it promises: with the link text “WebHostNinja” the user expects the link to point to something like “http://www.webhostninja.com” and nothing else. I don't like this from a usability point of view. Not that I have any evidence for it, but I suspect that relevant outgoing links help the site's Google juice. It definitely helps the PageRank of the sites I link to, and why should I not help them?
My solution is to use some jQuery magic. Download and install jQuery (by the way, if you like jQuery, take a look at the fantastic jQuery Tools). Then add the class “track_this_link” to each link tag that you want to track, and give each link tag a unique id (so you can track which link is which). The href attribute should point to the external site directly and not to some internal page. The link above would now look like this:
<a href='http://www.webhostninja.com' id='123' class='track_this_link'>WebHostNinja</a>
The next step is to add some JavaScript that attaches a click handler to all the links with the class “track_this_link”:
$(document).ready(function(){
    $('a.track_this_link').click(function() {
        // send the clicked link's id to the tracking script
        $.post('http://www.digitalistic.com/tracker.php', {id: this.id});
        return true; // let the browser follow the link as usual
    });
});
This means that each time an external link is clicked a POST request is sent to digitalistic.com/tracker.php with the unique id of the link in question. What is left is to implement the tracker.php script so that it handles the POST and saves the data you are interested in to the database in a secure and correct way. I am happy to just summarize the number of clicks per link on a daily basis, but you can of course save detailed data for each click if you want.
Do you have a better solution to this problem? Let me know!
There have been versions in Swedish, Spanish and English (even a brief French one) depending on where in the world I was at any given moment, and the designs have varied quite a lot, as you can see below. A few years ago I started blogging and since then digitalistic.com has been a perfectly OK blog. Here's a look back at old digitalistic.com designs…
1999 – 2002
Not much to say… it was one of my first web pages and I have learned a lot since then. At least I didn’t have a spinning @ sign or any applets on the page!
2002 – early 2006
Here’s the Spanish version of the second generation of digitalistic.com. I still use the logo from back then, even if the colours have changed (thanks Kemie for the great logo!).
2006 – early 2008
Then the blogging started…
(Both Pownce and Popfly, which I had invites to back then, are now dead – RIP Pownce!)
2008 – 20??
You are looking at it! (or if you are reading the RSS feed, you could be looking at www.digitalistic.com)
I have no plans to let digitalistic.com go and it will be my main domain for quite some time. It will be cool to see how it develops during the next 10 years (assuming we still use domains in 10 years that is).
As you noticed I didn't mention trends like the end of the newspaper industry as we know it, or mashups, or anything like that; I think the trends above have a bigger impact and/or are more interesting.
What do you think? Did I get things somewhat right or completely wrong?
I run OSX 10.5 and use Sequel Pro as my MySQL GUI, but if you use something else the approach should be similar. I assume that you do have SSH access to the remote server and also access rights to the database.
Don’t you feel a bit happier already?
Even if Coda is great, it is not so great that it cannot be made greater (so much for simplicity, hehe), which is pretty simple since Coda allows for plugins. In the Coda Developer Zone there are a number of plugins listed, and if you look around on the web you can find even more. You can also easily add new code completion, reference books and other goodies. This is a list of the extra stuff I have used and am very happy with so far…
URL Encode
This is a very simple but very practical plugin that allows you to highlight some text in your HTML files and then URL-encode it. As a Swede using a lot of words with åäö it is very useful.
PHP Toolkit
This plugin makes it easy to validate and clean up PHP files.
CodeIgniter Syntax Mode
Code completion with CodeIgniter classes and functions, a must if you are using Coda to develop CodeIgniter applications. You can download the file here and read more about it in this thread in the CodeIgniter forums. I have made this syntax mode my default one for PHP files since I hardly do any PHP that is not CodeIgniter anymore.
Extra books
It is easy to include help files about programming languages etc in Coda in the form of “books”. Out of the box Coda comes with books about PHP, HTML, CSS and Javascript, but it is easy to add more. Here is a great list of more books you can include in Coda, complete with icons and all. Personally I have added CodeIgniter and jQuery so far, but I am sure some Django, Drupal and WordPress will sneak in as time goes by.
What are your favourite add-ons to Coda? Please let me know if I have missed something I just must have!
For now it is just a temporary site to start getting the name out while we are working hard on the upcoming feature-rich real site, hopefully ready within a few months. If you want us to tell you as soon as we have the real F1Almanac.com up and running, please sign up for our newsletter. At the moment we are mocking up screens like crazy, and soon it is time to write some code.
If you have any ideas how to make F1Almanac.com as great as possible please let me know!
So now it is time for me to try out the domain market, not just the buying side but also the selling side. I am not sure how to value domain names, but I am taking a rough stab, hoping that I am neither too greedy nor too naive. Hopefully we can at least get some nice money to spend on Google Ads for our wedding project.
These are the domains I have, all for sale at MissDomain.com in 2 portfolios (Victoriasbrollop and Kungligtbrollop):
Let’s see what the domain fairy can bring me…
Basic setup
I assume you have WordPress installed on your local computer, including a MySQL database and all. There is a really good guide on how to do this on WordPress.org. Once installed I also assume you have played around with your theme and settings and gotten the WordPress site to look and work just the way you want it.
My second assumption is that you have an account at a web host that supports PHP and allows you to setup a MySQL database. If you don’t then you can easily find many good cheap options via my web hosting price comparison site WebHostNinja.com. Many web hosts have one-click installs of WordPress, but this is nothing you need right now.
My third assumption is that you have a domain or subdomain where you want your fantastic WordPress site installed. In this post I use the target domain name f1almanac.com, since that is the latest WordPress site I have deployed. Of course you need to replace “f1almanac.com” with your own domain name in all examples below.
Move the files
The first thing to do is to copy all your WordPress files from your local computer to your web host. In my case this means moving all the files under /projects/f1_wp/ to the directory on my host that corresponds to the domain I have chosen. For now just move all the files, there is no need to change anything in any file.
Move the database
The next thing is to move the database structure and all its content from your localhost to your web host. First off, do a MySQL dump of the structure and content of your WordPress schema. This can be done in most MySQL GUI applications; personally I use Sequel Pro, where the MySQL dump option is hiding under File->Export. Refer to the help files of your MySQL GUI app (or the MySQL command line if you are hardcore) for how to do a dump. The dump should result in a .sql file containing SQL statements that create all the tables needed and insert all the data into those tables.
Now we need to change some stuff in that .sql file. Open the file in a text editor and replace all local URLs with the URL of your new site. For me this means changing “http://localhost/f1_wp” to “http://www.f1almanac.com”. Without doing this your production WordPress installation would refer back to your localhost, and things would just not work. As always with search and replace, take it easy so that you don't break anything.
Create a new MySQL database on your web host, and open phpMyAdmin (or the MySQL client of your choice) for that database. In the “Import” tab of phpMyAdmin you can import an SQL file, so choose your newly edited .sql file and click “go” to import it. This creates all the tables needed and fills them with all the content you need, including pages, posts, plugin settings etc.
Change database configurations
At this point the WordPress installation on your web host does not yet connect to the newly created and populated database. To fix that, open wp-config.php on the host in a text editor (this is one of the files you uploaded to the host earlier). At the top of the file you find all the DB settings, so change DB_NAME, DB_USER etc. to correspond to your new MySQL database instead of your local database.
Once that is done you should have a fully working WordPress installation on f1almanac.com, or at least on your own domain.
Final touches
Now things are working fine, but there are still some final touches before all is done. First of all you should probably log in to /wp-admin on your newly deployed site and change the password of your admin user. I use extremely simple passwords on my localhost while developing, but I do not want to use simple passwords when things are live. So if you work the same way as me, go and change the password to something harder to crack than “guest”…
The last thing to check is that your media files are uploaded to an existing directory. Log in to the WordPress control panel and go to Settings->Miscellaneous. It is very likely that “Store uploads in this folder” is set incorrectly, since it was set when you installed WordPress on your localhost. Change it to the default “wp-content/uploads”, otherwise you will not be able to upload media files successfully.
That's it. This is a technique that has worked fine for me many times, but I am sure people smarter than me have better solutions. If you are one of those smarter people, please share them with us all in the comments of this post…
If feature set and usability decided my choice then Jaiku would be the winner, but of course community is the ruling factor. Since the communities I want to be part of will continue to be spread out over many microblogging services (maybe even more so once Jaiku goes open source), I think I need to be active on several services for the foreseeable future.
Aggregated Consumption is not the same as Communication
There are many good services for consuming the posts from all my contacts across several microblogging services. FriendFeed is a great example that also adds a lot of neat features (again, I am andreaskrohn). Some desktop clients also let me consume messages from several services; Twhirl, for example, lets me subscribe to posts from both Twitter and Jaiku. Jaiku lets me subscribe to RSS feeds, so that is a way for me to get my posts from other services into Jaiku. Bloggy has the very nice feature of letting me input my Twitter and Jaiku data so that all my posts from those services are also shown in my Bloggy feed, as well as all my Bloggy posts being posted to Jaiku and Twitter.
All these solutions have a common problem though: they miss what is key to microblogging – it is all about communication, and communication is a two-way game. From Twhirl I can only post to Twitter and not to Jaiku. My comments in FriendFeed cannot be looped back into Twitter/Jaiku/Bloggy. From Bloggy I can read Jaiku posts, and I can post to Jaiku, but I cannot participate in a threaded conversation on Jaiku. So consumption is not a problem, but efficient communication across several services is.
By now I am sure several of my readers are thinking “skip all other services, just use Twitter and stop complaining”. This might be a good strategy if I only wanted to communicate with the Twitter crowd, but believe it or not there is actually a world outside Twitter. Also, I am not complaining (not so far at least), just explaining a problem. A problem that is solvable!
Who are we Communicating with?
So what is the solution? As I see it the best solution would be a microblogging client that can do two-way communication with several microblogging services. So how would this client work if I could dream up a wishlist… To get to that wishlist, let's first think about how we divide up the people we communicate with. Personally I come up with these criteria:
All microblogging services and clients address some of these criteria. Groups can be handled by Jaiku channels, for example. You can communicate with individuals via the “@” notation or by direct messaging. By posting Swedish posts to Jaiku and Bloggy, and English posts to Twitter, the language criterion is somewhat dealt with. But this is not a natural workflow, and it sprouts conversations about the same subject in many different places.
Wishlist for the perfect microblogging client
The perfect microblogging client would allow me to communicate across services in a fluent way. It would detect what language I am writing my messages in and only send them out to the people who understand that language, no matter what microblogging service they are using. Geography and social groups should also be handled seamlessly across microblogging boundaries. It would be able to aggregate all comments made on one message into one place, to create one and only one joined conversation (mini-rant: why oh why doesn't Twitter have threaded conversations?). If I may continue dreaming, the perfect microblogging client would not only contain microblogs but also the good old chat networks – MSN, Jabber etc – where I still have quite a few contacts. My hope is that this already exists and has just passed me by; if so please let me know!
Faithful Digitalistic readers might remember that I ranted about this almost a year ago in the post The Need to Mashup Twitter, Pownce and Jaiku, but since nothing much has changed I took it upon myself to rant once more. Maybe in a year I will actually do something about it myself, or maybe I will just write another rant.
I am not sure what to do with them, but I think they all have potential. I especially love MashupCookbook.com. Unfortunately I do not really have an idea that is good enough (read: fun + making money) yet. Does anyone smarter than me have an idea?
Configuring the Search Engine in Google's control panel, adding sites etc. is a walk in the park. Integrating a Custom Search Engine into the site was a bit tricky though; the documentation is far from perfect. The trick is to host the search results on your own site in an iframe (set in the Custom Search Engine control panel under “code”) and to know that this iframe is generated by Google's JavaScript when it is time to show the search results. For a while I was unsuccessful in styling things since I tried to create my own iframe, but there is no need to complicate things like that. Also good to know is that running a Custom Search Engine locally works so-so; I got a lot of “The URI you submitted has disallowed characters.” error messages when running from localhost, but things worked perfectly once I uploaded it to the production server.
For the time it took (hours, mostly spent on learning the magics of CSS) it is an impressive piece of functionality on MashupSpy, and I will definitely use Custom Search Engines more in the future. If you have ideas on how I can improve MashupSpy, or if I have missed any sites (see the full list on mashupspy.com) in my Custom Search Engine, then please let me know!
It was extremely easy to get the interaction with Twitter to work using Twitter's super simple Search API. As a basis for my PHP code I used Simon Maddox's CodeIgniter Twitter library to speed up development even more. Basically the backend is nothing more than a cron job pinging Twitter's Search API every few minutes for new tweets and then saving them in a database. I am considering doing something more with this data at a later stage, but for now I just show it on the site. What took the most time was not figuring out how to use the Twitter API or writing the few lines of PHP needed – it was getting the site to look good using CSS, but I am quite happy with the end result. Hopefully I can do some more of my own CSS work in the future.
MashupCrowd has already proved useful for me in finding new innovative mashups. Today I found a cool use of Twitter to track the snow depth in the UK.
Google, redeem yourself quickly and give us the gift of an unlimited GDrive service (also, please do not look at the files I store there).
Those are the ones lined up so far, but I suspect I am going to add more to my plate. During 2009 I will really try to do fewer projects and spend more time on the ones I already have going – like enlatele.com.mx.
One thing I do not promise for 2009 is to be a more frequent blogger, I do not want to promise things that I am not sure I can keep…
Before getting started I want to thank Alexis Bellido at ventanazul.com for his patience with my Django questions. Check out his post Django questions and answers with a Swedish guy for some more info on Django setup (if you didn't figure it out, I am that Swedish guy).
Installation guides
When I set up Django locally I wanted to make my local development environment as similar to a production environment as possible, since it will hopefully make production deployment easier down the line. To do this I am running Python 2.5, Django 1.0, PostgreSQL 8.3 (and thus the necessary Python driver psycopg2) and mod_python on Apache. There are some good installation guides out there, so setting up this stuff was mostly a walk in the park; check out the install guide in the Django Book, the quick install guide at Django Project or the install guide at WebMonkey. Of course there are some gotchas that took me quite some time to figure out…
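For reference, with this stack the database part of settings.py would look roughly like this – a minimal sketch in the Django 1.0 settings style, where the database name, user and password are placeholders for your own values:
# settings.py – Django 1.0 style database settings for PostgreSQL via psycopg2
# (the values below are placeholders, replace them with your own)
DATABASE_ENGINE = 'postgresql_psycopg2'
DATABASE_NAME = 'mynewproject'
DATABASE_USER = 'django'
DATABASE_PASSWORD = 'secret'
DATABASE_HOST = ''    # empty string means localhost
DATABASE_PORT = ''    # empty string means the default port (5432)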
Uninstall old versions of Django
Before getting started I already had Django 0.96 installed, and I just installed Django 1.0 over it, assuming that it was going to replace the old version. I was wrong, and when I tried to run my new Django installation I got errors like “NameError: name ‘url’ is not defined” and “ImportError: cannot import name WEEKDAYS_ABBR” – neither of which made much sense to me. It turns out that if you do not uninstall an old Django installation first you get a mix of new and old Django files, and that just does not work very well. So take care to actually read and follow the instructions at Remove any old versions of Django.
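If you are unsure where the old version lives, a quick way to find the directory to delete is to ask Python itself (the exact path will of course differ on your machine):
# shows where the currently installed Django lives, for example
# C:\Python25\Lib\site-packages\django – delete that directory
# before installing the new version
import django
print django.__path__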
Use the right version of the db driver (duh!)
To get PostgreSQL working correctly with your Python install you have to use a version of the Psycopg2 driver that matches both your Python version and your Apache version, otherwise things just do not work and you do not really get a helpful error message. You can get the Windows versions of Psycopg2 here.
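A quick sanity check, assuming psycopg2 exposes its version string, is to run this with the same Python that mod_python will be using:
# if this import fails, the driver does not match your Python install
import psycopg2
print psycopg2.__version__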
Configure Apache for mod_python
Once Django and Apache are installed you can create your new Django project (via django-admin.py startproject myNewProject) somewhere Apache can find it (in htdocs for example). To be able to use Apache as your webserver you also need to install and configure mod_python. Installing is straightforward (see the install guides above for more info), but configuring Apache took me some time. You need to edit httpd.conf and add the following at the end of the file:
<Location "/myNewProject">
SetHandler python-program
PythonHandler django.core.handlers.modpython
SetEnv DJANGO_SETTINGS_MODULE settings
PythonOption django.root /myNewProject
PythonDebug On
PythonPath "['C:\Program Files\Apache Group\Apache2\htdocs\myNewProject'] + sys.path"
</Location>
myNewProject is of course the name of your own Django project.
Hopefully this is of use to someone other than me, if not, then at least I have my notes organised for the next time. If I have missed anything or gotten anything wrong I would very much appreciate your feedback!
This post was originally published in Swedish on Mashup.se, my blog about Swedish mashups and APIs.
Only Python
If you don't know Python you don't have a choice if you want to use Google App Engine – you just have to learn. It's not a difficult programming language to learn if you already know how to program. Django, the Python framework that really speeds up writing applications for Google App Engine, is also quite easy to pick up (tip: use the Google App Engine Helper for Django). Before I started with Google App Engine I hadn't written a single line of Python, and after a few intense days of concentration and coffee consumption I knew Python, Django and Google App Engine quite well.
Most 3rd party Python libraries work perfectly on Google App Engine (Django is just one example), but there are limitations. Only libraries that are 100% Python can be used, so if a library has any code in C it can't be used on Google App Engine. If a library has any code that makes an HTTP request or similar it can't be used on Google App Engine either – all HTTP requests have to be done via Google App Engine's URL Fetch API.
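For reference, a minimal URL Fetch call looks like this (the URL is just an example):
from google.appengine.api import urlfetch

# all outbound HTTP on App Engine has to go through urlfetch
result = urlfetch.fetch('http://www.example.com/feed.xml')
if result.status_code == 200:
    print result.content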
Differences between local dev environment and the production environment
One of the advantages of Google App Engine is that there is a local development server that simulates how Google App Engine works when an application is in production. This makes it easy to develop an application locally and then deploy it to the live servers on Google App Engine. As a developer it is important to pay attention to the differences between executing an application on the local server and in the production environment. It's no fun to spend time writing code that just doesn't work once deployed.
The biggest difference between the two environments is that requests made to 3rd parties work differently. If you have an application that uses the Delicious API it will work fine locally, but once deployed it won't work at all. The reason is that Delicious is blocking all requests from Google App Engine IP addresses. The same thing is true for the Twitter API, due to some HTTP headers that Google App Engine sets (Twitter claims to have fixed this now, but I haven't had a chance to test yet). To avoid these problems you need to test your application often in the live environment, especially when you use APIs to call other services.
Datastore
The Google App Engine Datastore has some limitations due to being a distributed database. The most obvious limitation is that there isn't an “OR” operator in GQL (the Datastore version of SQL), but that is easily handled when coding. A more annoying limitation is that it is not possible to create a new entity (data object) in the Datastore via the Google App Engine dashboard unless an entity of that type already exists, and that there is a real noticeable delay between what one can see in the Datastore in the dashboard and what really is stored in the Datastore. This makes it very hard to check what data is actually stored in the Datastore at any given moment, which makes debugging more difficult.
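The missing OR operator in practice just means running one query per value and merging the results in code. A minimal sketch – the Post model and its status values are made up for the example:
from google.appengine.ext import db

class Post(db.Model):
    status = db.StringProperty()

# GQL has no OR, so query once per value and merge the results yourself
drafts = Post.gql("WHERE status = :1", 'draft').fetch(100)
published = Post.gql("WHERE status = :1", 'published').fetch(100)
all_posts = drafts + published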
No scheduled processes
One of the most restrictive limitations of Google App Engine is that there is no way to start a process other than via an HTTP request. There is no kind of recurring scheduled process (like a cron job), and no triggers or hooks to start a process when a special event occurs. Almost all web applications need some kind of scheduled process to function correctly – to clean up old data, send emails, consolidate statistics or fetch data from an RSS feed once an hour.
The easiest way to get around this limitation is to create a cron job on another server that calls a URL in the Google App Engine application. If you have a lot of visitors you can also perform the background work as part of a user request – for example, you could import data from an RSS feed when a user logs in to your application. This will of course make the user experience slow, and there is no guarantee that users will perform the triggering action at the right times.
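A sketch of such a cron-pinged endpoint, using the webapp framework that ships with App Engine (the URL and the import_feed function are assumptions made for the example):
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

def import_feed():
    # placeholder for the real work, e.g. fetching an RSS feed via urlfetch
    pass

class FeedCronHandler(webapp.RequestHandler):
    def get(self):
        # an external cron job simply GETs this URL, e.g. once an hour
        import_feed()
        self.response.out.write('ok')

application = webapp.WSGIApplication([('/tasks/import_feed', FeedCronHandler)])

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()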
No matter which solution is implemented, one will quickly bump into the next limitation of Google App Engine: only short processes are permitted.
Only short processes
Something I have learned the hard way is that Google App Engine is only built to handle applications where a user makes a request and quickly gets an answer back. There is no support for a process that executes for a longer time, and for the time being this limit seems to be approximately 9 seconds per process. After that an exception is thrown and the process is killed. It does not end there: even if you are nowhere close to using your assigned CPU resources, you can quickly use too many resources with long-running processes, since they have their own unique resource pool. If you use too many resources your application is shut down for 24 hours, and right now there is no way to buy extra resources.
This is a really serious limitation in Google App Engine that is really difficult to get around. If you need heavy processes it is recommended that you use Amazon EC2 or something similar. To handle (not get around) the limitation you have to handle exceptions in a nice way and use transactions. More about this in William Vambenepe's very informative post Emulating a long-running process (and a scheduler) in Google App Engine. He has some tips on how to get around this limitation, even if it is not recommended since you risk having your application shut down.
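Catching the deadline exception gracefully looks roughly like this (a sketch; process_items and the checkpoint strategy are assumptions):
from google.appengine.runtime import DeadlineExceededError

def process_items(items):
    # placeholder for the real per-item work
    pass

def handle_batch(items):
    try:
        process_items(items)
    except DeadlineExceededError:
        # the process is about to be killed – save a checkpoint here so the
        # next request (or external cron ping) can pick up where this stopped
        pass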
More limitations
There are more limitations, read Google App Engine: The good, the bad, and the ugly? for a longer list. Just keep in mind that some of the limitations mentioned in that post already have been addressed by Google.
To summarize: you need to really know what limitations there are in Google App Engine before you spend time and energy on developing an application. If the limitations are not a problem then there is a lot to gain by using Google App Engine.