Author Archives: Inash

The Magazine

Originally written on 21 January, 2010 01:33 AM for the MOSS Magazine Issue #2 (08 February, 2010), for the “/dev/null” column. I’m republishing it here so that it is available on the public web as well.

This is an idea that sprang out of one of our casual coffee meetings. Before I delve into the details of how we came to conceive this idea, I would like to clearly state the purpose and goals of the Magazine.

First, the main objective is advocacy and getting the community involved, at least in sharing articles and bits and pieces of information related to free and open source software, as well as getting technical articles targeted towards potential developers in the open platforms arena. In this way, the community will slowly begin to grow from self-help to a more community-oriented knowledge-sharing model. That is when we’ll see actual local expertise kick in, which will eventually pave the way for commercial support bases.

The second objective of the Magazine is to serve as a window for MOSS to generate revenue as a funding source for its various operations. Although we have not held a major public event, I can proudly say that as much as MOSS has thrived, it has done so without any financial support. Through selective advertising in the Magazine, we intend to open a window for entities which support FOSS in some way to expose themselves. Not just any entity: currently, we’re looking for commercial support companies which are embracing, or looking into embracing, open source products and services as a business.

Coming back to the original idea, we were brainstorming about possible ways for MOSS to derive at least some revenue for its operations. One of our regular members proposed the idea of the Magazine and poured in his ideas about how we’d go about doing it, the kinds of advertisements we’d support, and so on. After a short while, when the rest of the members started to get the picture, the conversation took off. We went on discussing it for a couple more days, and the topic was finally brought up in one of our weekly meetings, making it into the minutes on 13 December, 2009.

From that moment on, key members were able to stir up some articles and work on the layout and design of the magazine, which came out on 8 January, 2010. It was hugely exciting for us when we first received the draft compilation and glanced over the arrangement of what would come out on the final date.

As we speak, some of the dedicated members of the community are still working on writing articles, how-tos, interviews and experiences to bring together more about FOSS here in the Maldives. The Magazine has become a sort of catalyst for our main mission of advocating FOSS.

The Magazine has a guideline on the submission of articles from the community. We’re yet to work on detailing the policies and processes involved in compiling the Magazine, both for the community and for the Magazine committee. We’re also looking for editors who might be willing to work with us professionally on the editorial side, and who understand, or are interested in understanding, FOSS concepts. We have wild ideas running here and there and would very much like the community’s support in putting those ideas to the test.

We’re also doubtful about whether we can sustain a monthly release cycle for the Magazine. It is quite a bit of work, and if we do not receive an adequate number of articles by the deadline (the end of the month), we might just have to wait another month before releasing an issue. But we really want to stick to a solid periodical release cycle: monthly, bi-monthly or quarterly; monthly at best.

So in this rant, I wanted to talk about the Magazine and say that it has been one of our most successful projects so far, and it is yet to be seen how far it goes. We’re very positive about it and we’re giving it our all.

Building from Source Tar Files

Originally written on 04 February, 2010 01:46 AM for the MOSS Magazine Issue #2 (08 February, 2010). I’m republishing it here so that it is available on the public web as well.

This how-to shows what “.tar.gz” files are and how they are used in general. We received an email from one of our readers asking for an article on how to work with “.tar.gz” files and how applications distributed in tar files can be installed and used.

A “.tar.gz” file, also simply called a “tar file” or “tarball”, is an archive format. It usually comes compressed in a format generally available on a GNU/Linux system, such as gzip, bzip2 or lzma. A command line program called “tar” exists for the purpose of creating and handling tar files. Simply put, a “.tar.gz” file serves the same purpose as the “.zip” archive format.
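As a quick illustration of that analogy, packing and unpacking a directory works much like zipping and unzipping. The file and directory names below are just examples made up for the demonstration:

```shell
# Create a small directory to archive
mkdir -p notes
echo "draft" > notes/todo.txt

tar -czf notes.tar.gz notes   # c: create, z: compress with gzip, f: write to this file
rm -r notes                   # remove the original to prove the round trip

tar -xzf notes.tar.gz         # x: extract, restoring the directory
cat notes/todo.txt            # the file is back, contents intact
```

The same round trip could be done with “zip” and “unzip”; tar is simply the convention on GNU/Linux.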

Typically, GNU/Linux programs are distributed in this format. Most follow the convention of using “program-name_1.0.1_src.tar.gz” for the source code archive and “program-name_1.0.1.tar.gz” for the binary build.

Let’s begin with using these files on the latest version of Ubuntu. We’ll also download a small tool as a sandbox to look at how programs are built from source code on these platforms. At this point, it is necessary to know how to distinguish source tar files from binary tar files. That way, you can find out early whether a binary package has already been generated for the distribution you’re using. For example, on Debian and Ubuntu derivatives, programs are packaged as “.deb” files, which means you do not need to download the source tarball, extract it, and configure and build it from scratch.

Let’s download a small utility that lets you test the performance of websites. The tool is developed and provided by HP. You can download the source code from the project’s website. Once done, navigate to the downloaded folder from the command line, e.g. /home/user/Downloads/. Issue the following commands to extract it and go about building it. Note that this is a very basic way of building most programs on GNU/Linux, and it should be almost the same for most programs out there.
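The full sequence looks like this. Since the original download isn’t reproduced here, this sketch first synthesizes a tiny stand-in archive (with a toy “configure” script) so the steps can be run anywhere; with the real httperf-0.9.0.tar.gz in hand, you would skip that setup and run only the numbered commands:

```shell
# --- Stand-in setup: fabricate a minimal "source tarball" for demonstration.
# --- With a real downloaded tarball, start at step 1 below instead.
mkdir -p httperf-0.9.0
cat > httperf-0.9.0/configure <<'EOF'
#!/bin/sh
# Toy configure script; a real one probes the system and generates Makefiles
printf 'all:\n\t@echo built\n' > Makefile
echo configured
EOF
chmod +x httperf-0.9.0/configure
tar -czf httperf-0.9.0.tar.gz httperf-0.9.0
rm -r httperf-0.9.0

# --- The build sequence walked through in the text:
tar -zxvf httperf-0.9.0.tar.gz   # 1: extract the tarball
cd httperf-0.9.0                 # 2: enter the source directory
mkdir build                      # 3: create a separate build directory
cd build                         # 4: enter it
../configure                     # 5: configure the source for this system
make                             # 6: compile the source into binaries
# sudo make install              # 7: copy the results into system paths (needs root)
```

The final “sudo make install” is left commented out here because it writes outside your home directory; on a real build you would run it as shown.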

Let’s go through what’s happening above. The first line runs the “tar” program, which handles tar files. The second argument, “-zxvf”, is a set of command line options that tell the tar program what to do. The third argument is the name of the tar file to perform the actions on. You can do the same by right-clicking the file in the GNOME file browser (Nautilus) and selecting “Extract Here”.

The command line options are:

  • z: filter the archive through gzip, since the tar file is compressed in the gzip format, as indicated by the second file extension, “.gz”.
  • x: extract files from the archive.
  • v: verbosely list the files processed (optional). This displays a list of the files in the archive as they are extracted.
  • f: use the archive file or device given by the argument following the options; in our case, the third argument, the name of the file.
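One related option worth knowing: replacing “x” with “t” lists an archive’s contents without extracting anything, which is handy for inspecting a download first. The small archive below is synthesized purely for the demonstration:

```shell
# Build a small archive to inspect
mkdir -p sample
echo hi > sample/file.txt
tar -czf sample.tar.gz sample
rm -r sample

tar -ztf sample.tar.gz    # t: list the contents; nothing is written to disk
tar -ztvf sample.tar.gz   # adding v shows permissions, sizes and dates as well
```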

Now if you do an “ls” or browse to the Downloads folder on your system through Nautilus, you will find a folder named “httperf-0.9.0”.

The second command, “cd”, changes the current working directory to the newly extracted directory. We then create a folder named “build” with the third command, and change into that newly created “build” directory with the fourth.

The fifth command is special, in that it configures the source code to be built for your specific distribution. Since different systems have different file system standards and different environments, the configure script knows about these differences and prepares things appropriately.

The sixth command actually tells the system to start compiling the source code into binary files that can be executed on the system. At this point, if you view the “build” directory, you will find new files and a “src” folder. If you navigate to the “src” folder, you will find various intermediary build files used by the “make” program, and the actual executable named “httperf”. We can run the program here by issuing “./httperf --help”. It will run and display the help information for the program.

The last line is also special, in that it copies the necessary files to the system paths. It installs the executable in the system’s directory for locally built binaries, “/usr/local/bin/”, does the same for the “idleconn” program, and finally installs the man (manual) page in “/usr/local/share/man/man1/”, which can be viewed by executing “man httperf” on the command line.

There you have it. You’ve successfully built and installed a program on your system. Now, at any time, you can run the “httperf” program from your command line. This is a typical program build process, as mentioned before. The program can be uninstalled simply by executing “sudo make uninstall” from the same build directory (“/home/user/Downloads/httperf-0.9.0/build/”).

Now for how a binary tarball differs. You can extract any type of “.tar.gz” file with the first command mentioned above. If you list the extracted files with the command “ls -l”, it will display the directory in a list fashion, with the file permissions, owner, group, file size and date modified as columns. If any files are shown in bold, or have an “x” in the file permission block, they can be executed. All you have to do is type in the command “./program-name” and the program will be executed. A non-source tarball will not have the “configure” script or files like “install” or “Makefile.*”.
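That permission check can also be done from the shell itself. The “program-name” file here is a toy script fabricated for the illustration, standing in for an executable shipped in a binary tarball:

```shell
# Create a toy executable script (stand-in for a shipped binary)
printf '#!/bin/sh\necho running\n' > program-name
chmod +x program-name            # set the execute bit, the "x" mentioned above

ls -l program-name               # the permission block now contains an "x"
if [ -x program-name ]; then     # "test -x" checks that same bit from a script
    ./program-name               # and if set, the file can simply be run
fi
```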

You can find more about working with tar files by searching the web, which will land you on various sites and weblogs that share how to work with tar files, as well as how to build and run programs distributed in them.

A Community Coming of Age

This was originally written for the MOSS Magazine Issue #1 (08 January, 2010), for the “/dev/null” column. I’m republishing it here so that it will be on the public web as well.

Originally written on 25 December, 2009 10:56 AM

It has been a year since this little community, the Maldives Open Source Society (MOSS), began to become active and solid. This came about through activities that began in two waves: one which spawned off after the Google Groups mailing list was created back in December 2008, together with a post on the 2nd of January, 2009 titled “1st MOSS Meetup” (the name itself was first coined on the 27th of December, 2008); the second when the translation effort was taken for a spin on the 14th of April, 2009.

Since then, most things have been discussed and settled through brainstorming and voting on the mailing list. This shows a strong tendency to keep things loosely coupled and very community-oriented, with no special group of people deciding its direction.

It was then decided to register MOSS as an NGO, which was completed after a very long process on the 14th of July, 2009. That was a very joyful day for all of us, since the legal aspects of our doings could now be accepted, realized and accounted for. We held our first general assembly within the week, on the 19th of July, 2009, and members of the steering committee were elected. The general assembly was held in the evening at Ameeniyya School and was attended by one of the honorary members of the community as well as new faces.

We then went on to cater to some floating ideas here and there within the community, while working towards advancing our cause within the government: from trying to convince the government to give equivalent opportunity to FLOSS in its software procurements, to mandating technical policies through the legislature. We introduced ourselves to the government’s technical bodies and the ministry concerned, the Ministry of Civil Aviation and Communication. A presentation was prepared by a core team and delivered to the decision makers of those bodies, explaining what FLOSS is, along with the benefits it can bring to the government, and to society at large, by adopting the emerging technologies being embraced by FLOSS.

We were awarded a pilot project to migrate a department to FLOSS, to prove to them the equivalent capabilities of FLOSS, and as a test bed to find complications that may arise when it is adopted government-wide. It was a successful project, with only small complications. In general, the issues that were faced were little compared to the value lost, in terms of time and effort, in managing commercial infrastructures.

Some plans for public events have not been very successful yet, but there was a little event on the 11th of July, 2009: “World Population Day 2009”. A mention of the possibility of MOSS grabbing a stall at the event to promote at least a little of its cause drove a few members to prepare for the event overnight. It was a casual little event, but the members involved were able to connect with the children during the evening and convey a whole new dimension of possibility to them. We even got to promote the Edubuntu collection of educational applications and games, and it was a huge attraction.

Software Freedom Day 2009, a global day celebrated amongst FOSS communities for its namesake cause, overwhelmed us, and we ran out of time to prepare for it completely and get properly organized. So it was not a successful endeavor, but I’m sure it will be for real this year, as most of our focus is on advocacy to foster awareness about our cause. We’re preparing for a public event soon, although I’m not exactly sure when that will be at the moment.

So that’s a very little bit of the history of MOSS this year, wrapped up and briefly packed. What I would like to say is that people are getting involved and we’re seeing new faces every once in a while. This is completely participatory, not compulsory. In this way, we’ll be able to grow and create an environment for people who are born through software, whose lives revolve around the software they use, and who are able to share and contribute to that world in whatever way they can.

The best thing about the kind of community we are is that, regardless of the aspects of MOSS as an NGO, people are able to come along, relate to what they’re interested in, do their bit and go away for a while. And if we’re able to see that pattern repeat more often, I believe that MOSS as a community will have come of age.


What is the advantage of choosing Free and Open Source software over proprietary software?

Many studies have found that FOSS is less vulnerable to attacks and malware than proprietary systems. And when vulnerabilities are found, they are usually patched faster.

In a 10 month stress-test in 1999, Windows NT (the top windows server platform at the time) crashed an average of once every 6 weeks. None of the Linux servers crashed during that entire period.

Open standards and vendor independence
Open standards give users, whether individuals or governments, flexibility and the freedom to change between different software packages, platforms and vendors.

Reduced reliance on imports
Proprietary software is almost always imported, with the money going out of the country. FOSS is usually financially free, and can also be developed within the country.

Developing local software capacity
With its low barriers to entry, FOSS encourages the development of a local software and support industry.

The easily-updatable nature of FOSS allows for the fast creation of software that is tailored to the local language and culture. This is almost impossible with proprietary software.

Note: Taken from a flier I got during Apache Asia 2009 Roadshow in Colombo. Supplement prepared by The Linux Center, Sri Lanka.

Apache Asia 2009 Roadshow (Day 1)

Today ends the first day of the (partly) 3-day Apache Asia 2009 Roadshow seminar in Colombo, Sri Lanka. I had been greatly anticipating this event, and now that I’ve successfully been able to attend, I’m grateful to those who put aside time for me to get out of Male’ and attend it. It’s partly 3 days because the 3rd day is supposed to be an unconference. It’s a clever term for unwinding the event with a participant-driven conference centered around a theme or purpose, used primarily in the geek community.

The event has 3 keynote speakers, one of whom is a distinguished Sri Lankan professor named Mohan Munasinghe. He is a physicist with a focus on energy, sustainable development and climate change. He was also the Vice Chairman of the Intergovernmental Panel on Climate Change (IPCC), the organization that shared the 2007 Nobel Peace Prize with former Vice President of the United States Al Gore. He talked about sustainable ICT, the environment, and climate change in general. You can read more about Professor Mohan Munasinghe on Wikipedia.

Following the keynote speech by Professor Mohan Munasinghe, Greg Stein delivered his keynote, “Reflecting on 10 years with ASF”. He is a director of the Apache Software Foundation, and served as chairman for a couple of years in the past. His talk was particularly interesting, as his focus was on relaying the key lessons identified during his career as a developer, how he came to be a director of the ASF, and his time chairing the foundation. He then went on to talk about his past experiences and how the audience could relate to him and pursue a similar path.

The rest of the talks were from experienced Sri Lankan developers who have been regularly contributing to and driving the course of certain Apache projects: Axis2, Apache Synapse, Stonehenge and Apache Woden, to name a few. How I wish we had contributors back home.

To say a little more about some of the talks: they were particularly interesting, especially because these projects squarely target enterprise Service Oriented Architecture (SOA). Middleware applications such as these enable in-house developers to tap into and expose data on disparate, heterogeneous systems (legacy or otherwise) to be consumed, transformed and utilized by more modern interfaces, enabling an interconnected platform.

These systems make use of industry-standard protocols such as SOAP and WSDL, and are designed to interoperate with other stacks, platforms and libraries. Given enough thought, some of these projects are certain to be fruitful for enterprise developers who work heavily on integrations.

For questions asked by the audience at the end of the talks, shirts and hats were awarded. As for the unconference, I’m not sure how they selected members; if I remember correctly, 25 people were selected by the organizing team. To my surprise, my name was called out almost at the end of the day, and I was awarded a cap and invited to the unconference. I’m assuming my contact Suchetha Wijenayake, who was on the organizing team, must have given out my name.

The agenda can be found at the event website. I shall follow up with the events of the next days of the conference. Cheers.

Linux magazines and DVDs giveaway

Over the course of 2007 I collected some Linux magazines and the DVDs that come along with them. I’m giving them away, now that they’re outdated for me and occupy some of my bookshelf space.


Here’s a list of the issues, with links to their web pages:

  • Linux Magazine. Issue 73. December 2006.
  • Linux Magazine. Issue 74. January 2007.
  • Linux Magazine. Issue 75. February 2007.
  • Linux Magazine. Issue 77. April 2007.
  • Linux Magazine. Issue 78. May 2007.
  • Linux Magazine. Issue 79. June 2007.
  • Linux Magazine. Issue 80. July 2007.
  • Linux Magazine. Issue 81. August 2007.
  • Linux Magazine. Issue 82. September 2007.
  • Linux Magazine. Issue 83. October 2007.
  • Linux Magazine. Issue 84. November 2007.
  • Linux Magazine. Issue 85. December 2007.
  • Linux Magazine. Issue 86. January 2008.
  • Linux Magazine. Issue 87. February 2008.
  • Linux Journal. Issue 158. June 2007.
  • Linux Journal. Issue 159. July 2007.
  • Linux Magazine. Volume 10. Issue 2. February 2008.
  • Linux Magazine. Volume 10. Issue 3. March 2008.
  • Linux Magazine. Volume 10. Issue 4. April 2008.
  • Linux Magazine. Volume 10. Issue 5. May 2008.

That’s about it. All the DVDs are from the German Linux Magazine, which is known in the US and Canada as Linux Pro Magazine. The second list of Linux Magazines is from, which dates way back to 1999. Cheers.

Update 2009-11-23: I’ve given them away to @iharis. If anyone is interested, bug him; he’ll be giving them away once he’s done with them.

Blog Re-entrant

In an attempt to hopefully resume blogging, I’m re-enabling this blog and re-entering the blogosphere. A new year comes with new resolutions, including those of the past, and making old resolutions stick and work through should be among the new ones.

I’ve never been good at speaking my mind. My English prose is quite poor too. With new projects and prospective undertakings within the community, I intend to share these endeavors and try to make them heard through this portal, at the same time weaving a firm net below my writing so words won’t fall off before they reach your ears.

First, in an attempt to revive our local Linux User Group, a group of members has been actively participating in discussions towards a better direction for it. They have successfully hosted a small meetup, and the minutes are available on the group’s discussion page at Google Groups. Branching off from this, and possibly parenting the movement, might be a new entity, the Maldives Open Source Society (MOSS), in the hope of establishing, promoting and advocating Open Source Software in general and its benefits.

I intend to summarize my thoughts and opinions on these and similar matters, and other things of my interest, possibly in subtler ways than before, in the hope of being honest and generous in my words and messages. A happy new year to you all, and let 2009 be a period of prolific development and betterment of our lives and the community we live in. Cheers.