Patrick Quinn's Blog

A babbling brook of information and thoughts.


Report: OS.js; Future Perfect

January 9, 2013

As many of those who have read my previous articles will know, I am a big backer of the web, especially web technology on the desktop. However, with the recent course corrections made by Facebook in their mobile offering and the relatively poor performance of HTML5 applications to date, this belief had become marred with doubt. That was, until I came across OS.js.

 

OS.js started life as a humble hobby project by developer Anders Evenrud, as a way to manage the configuration of his home server remotely using a GUI. It soon became apparent that it had the potential to be more. The result of this realisation was an SDK, an IDE and a minimalistic, lightweight, multi-user web desktop, all running on top of Linux as a complete, standalone OS. Its features include package, session and resource management, a virtual filesystem and support for Glade/GTK+ based UI widgets (which means the Glade designer can be used to design and develop application UIs), as well as a plethora of default applications, demos and games. Best of all, OS.js is 100% free and open source and is released under the BSD license.

 

OS.js desktop

So why exactly is a platform like this exciting from a technology standpoint? Surely this is all old hat? 

Well, yes and no. Sure, there are existing web applications which ape desktop environments, but these are all meant to be run solely as thin clients and are web based and distributed in nature. OS.js, however, is targeted at the desktop itself, with the ability to run thin being only a side-effect (and side benefit) of the core technologies used. As the world shifts its focus more towards the web as a common platform for both content and communication, HTML5 is gaining recognition as a first-class citizen amongst users and developers alike at a rapidly growing pace. Microsoft has even made HTML5 and JavaScript first-class languages for native app development on Windows 8, and Canonical have announced that HTML5 is to play a major part in their upcoming Ubuntu Phone OS strategy.

 

On a more personal note, what is it about this project that has me agog? For me this project is the realisation of a dream I have had for several years now: a paradigm shift in how we perceive the desktop and desktop application development, and a convergence between the standard desktop and cloud computing models.

Historically, several people have argued that my interest in the desktop is nothing more than futile noise, a relic from the halcyon days of IBM (with others labelling me a hapless dreamer). In reality, however, it is the idea of a 'platform' itself which is becoming an irrelevant notion. The future of applications is heterogeneity: a ubiquitous delivery mechanism for all applications across all platforms. The holy grail, nay, the unicorn of the computing world. Be it desktop, mobile or somewhere in between, as long as it can render a web page it will be capable of running any application. And this concept excites me greatly.

 

OS.js applications

There are, however, a few remaining hurdles with this sort of technology, the biggest being performance. HTML5 applications have yet to really live up to their native counterparts when it comes to performance, especially on the lower-end systems that populate the consumer market. This is in part due to the high level of abstraction at which interpreted languages such as JavaScript operate, and to some extent the engines which run them. However, things are beginning to change in HTML5's favour with technologies like Google's V8 and Node.js coming into vogue. As the gap between native and web performance closes, and higher-level concepts begin to appear in scripted languages, OS.js has never been more relevant than it is today. Thankfully, Anders has done a cracking job of providing a showcase for what can be done when web technologies are applied correctly and efficiently, with OS.js simply screaming along.

 

For those who are interested in OS.js, you can hop on over to https://github.com/andersevenrud/OS.js (which I strongly recommend) and check out the source code and class documentation, or see an example of the technology in action at http://osjs.0o.no . You can also keep up to date with Anders Evenrud on his personal blog over at http://anderse.wordpress.com/

 

OS.js mail application

 

Report: Just what is ’good design’? A Functionalist’s View

October 21, 2012

Admittedly, this could well end up being one of the most subjective and personal articles I ever write, but design and the industrial process are something which I've always greatly appreciated and held dear. This is an article biased toward my own personal preferences (i.e. skeuomorphic vs digital, functionalism/minimalism vs naturalism, color vs monotone, etc). It's only in recent years that I've begun to understand the elements that combine to make the products I hold on a pedestal exist and excel.



Lately I've become fixated by the work of 1970s Functionalist design legend Dieter Rams. His work at Braun was a ray of light in a decade filled with gaudy, tacky architecture and interior design, and it has become the inspiration for much of modern design, including most of Apple's current offerings (both Steve Jobs and Jonathan Ive have credited the work of Dieter Rams as the basis for their own). Rams' 10 principles of 'good design' are the staple of any industrial designer worth their salt (even for those who choose to eschew them, they will always be an integral part of their trade).

 

These are:

 

• Good design is innovative.

 

• Good design makes a product useful.

 

• Good design is aesthetic.

 

• Good design helps us to understand a product.

 

• Good design is unobtrusive.

 

• Good design is honest.

 

• Good design is durable.

 

• Good design is thorough down to the last detail.

 

• Good design is concerned with the environment.

 

• Good design is as little design as possible.
 

Here we have an excellent template for anyone trying to create a functional yet beautiful product. However, good design doesn't stop there. Good design is also an emotional pursuit, subconscious and inherently intuitive. 

Let's look at the first of these three extra categories: emotion. It's not enough for a design to be perfect on paper. It needs to pass the 'person test'. People are by and large driven by pathos, so it is important that when they use a product it immediately conjures up an emotional response. For example, when we hold a new mobile device for the first time we might notice that its weight, balance and finish are reminiscent of a '90s VHS tape. Though that lends the item a nostalgic effect, it would probably make for one hell of an ugly smartphone. It is therefore important to factor in the quality and finish of the materials we are using and their properties. While some may argue that this is solely a manufacturing matter, ensuring an item evokes pride and importance through the materials used is key: it allows its owner to engage with it on an emotional level, to feel something.
 

The second of these extra categories, the subconscious, is just as important as any of Rams' 10 principles. The ability to have a user appreciate a product without being able to say conclusively what it is they specifically enjoy about it is paramount. It's what gives a product its magic. To achieve this, each of its design choices, accents and features should complement one another, melding into one unified, fulfilling yet functional experience, without being blatant or ostentatious. This, after all, is the essence of the magic of superior design.

This brings us to my final point: the importance of intuitiveness. In the same way it's not enough to make a design 'paper perfect', neither is it enough for a product to be clever. Good design should be pragmatic too. One of the hallmarks of functionalism is that each design choice has a specific purpose and is included so as not to ruin the minimalist aesthetic: no superfluous knick-knacks or doodads just to add a touch of flair. Flair is to be used sparingly and can be achieved through the correct use of contrast, accenting and color, not by adding a golden triskelion just because you can. Good design is neither form over function nor the inverse, but rather the two working in harmony to create a memorable experience which can be both appreciated and enjoyed.


General: A touch of contrast.

October 19, 2012

Deepest and sincerest apologies for my lack of content these last few months. Life and work got the better of me. I should have some fresh and rather juicy new articles for you tomorrow. In the meantime, this makes for a great opportunity for something a little different, something sophisticated and not at all techie: the Arts.


Here is an excerpt from the link below, and one of my personal favourites:


 'Seehaus im Sommer' 
by Ciarán Mangan 

Decorated garden,
Decked out in postcard perfection,
Bayern enshrined,
Time honoured.

 
Beer mugs dot the tabletops,
Stout glacial giants,
Standing tall and proud
Amongst the spotted raindrops,
Mirroring the glinting lake face,
Spilling irrepressible prisms.

 
Jugs filled with the centuries fermented,
Draped upon their noble brims
Dappled sunlight with
Perspex dances of dragonflies.

 
All life emanates from the heart- 
The perennial splendour of that chestnut tree,
And the sound of birds rolling above
Pours like helles from its canopy.
(Summer 2012) 

 
Ciarán is a good friend of mine and I feel his work needs to be seen by more bodies than just myself. I hope you will extend the same support that you have me over the past few years and enjoy the content linked. While it may not be technologically relevant, it allows for a bit of context, something many of us (myself included) desperately need nowadays.

I will check back in with the aforementioned new content soon. In the meantime relax, enjoy, and don't forget to sound off in the comment section below if you read something you particularly like!

Cinders and Roots 

Report: Bitcoin. Open, decentralized and growing.

June 18, 2012

The term Bitcoin (BTC) has become a buzzword, most recently associated with the euro, pyramid schemes and a proposed solution to the current banking crisis and economic turmoil. But what exactly is "Bitcoin", and is it something one should be investing in?

What is Bitcoin:

Bitcoin is an open source, peer-to-peer crypto-currency which relies on its users as the mint. Conceived by Satoshi Nakamoto and released in 2009 (0.1 alpha), Bitcoin employs a "proof of work" system which ensures security and legitimacy of transfer and generation, and eliminates double spending. Each user can have one or several Bitcoin addresses (similar to a bank account number), which are generated using elliptic curve DSA (ECDSA). These addresses link to a digital wallet, stored locally on the user's machine, or to an online wallet known as an 'instawallet'. Using these addresses, users can transfer any amount of Bitcoin they wish to and from any wallet. The network's proof of work is built on the (currently still unbroken) SHA-256 hash function, and as with a physical wallet, if a wallet file is lost, its contents are lost with it.
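To make the "proof of work" idea concrete, here is a minimal sketch in Python. It is not Bitcoin's actual block format (real mining double-hashes an 80-byte block header); the data and difficulty here are purely illustrative:

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Toy proof of work: find a nonce so the SHA-256 digest falls below a target.

    Real Bitcoin hashes an 80-byte block header twice with SHA-256,
    but the principle is exactly this.
    """
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce  # costly to find, but trivial for anyone to verify
        nonce += 1

print("winning nonce:", mine("some transactions", difficulty_bits=16))
```

Verifying a claimed solution takes a single hash, which is why legitimacy is so cheap for the rest of the network to check.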

Mining: 

Bitcoins are generated by their users instead of by a centralized mint or governing body. Blocks are mathematical problems generated by the Bitcoin network, and users willing to give their computer resources to solving them, known as miners, are rewarded with the Bitcoin that solving a block results in (initially 50 BTC per block). There is a finite number of bitcoins that can ever be created (in and around 21 million), and as the number of blocks generated increases, so too does the difficulty of each block, with the number of coins per block decreasing to match.

Because of this, in order to keep generating at a consistent rate, miners must add more and more resources to their mining efforts, hence the pyramid connotations, with early adopters being rewarded more than later ones. However, unlike in traditional pyramid schemes, those early adopters do not directly govern the profitability of those mining below them. This scheme keeps inflation in check and Moore's law at bay.
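For the curious, the oft-quoted 21 million figure falls straight out of the reward schedule: the 50 BTC block reward halves every 210,000 blocks. A quick back-of-the-envelope sketch:

```python
# The ~21 million BTC cap follows from the halving schedule:
# a 50 BTC reward that halves every 210,000 blocks.
reward, total = 50.0, 0.0
while reward >= 1e-8:          # 1e-8 BTC (one "satoshi") is the smallest unit
    total += 210_000 * reward  # coins minted during this halving era
    reward /= 2
print(f"total supply: ~{total:,.0f} BTC")  # prints ~21,000,000
```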

 

Originally, miners could find profitability in using their CPUs to mine for coin, but as the difficulty of the blocks increased, the only viable method of mining became the use of GPUs, which have a much higher megahash-per-second throughput than CPUs and are relatively inexpensive to procure. Now, however, while there is still some viability in GPU mining (although it requires multiple high-end, OpenCL-capable GPUs), it has a large footprint, both physical and economic.

The big money today lies in purpose-built FPGAs, or Field-Programmable Gate Arrays, which exist for the sole purpose of solving Bitcoin blocks. They provide a high return per dollar spent and per watt consumed for each megahash per second produced.
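To see why FPGAs win here, a rough comparison helps. The hashrates and wattages below are illustrative ballpark figures for 2012-era hardware, not measurements:

```python
# Rough megahash-per-watt comparison. All figures are illustrative
# ballpark numbers for 2012-era hardware, not measurements.
hardware = {
    "CPU (quad core)": {"mhash_s": 20,  "watts": 95},
    "GPU (high end)":  {"mhash_s": 400, "watts": 250},
    "FPGA board":      {"mhash_s": 400, "watts": 20},
}
for name, spec in hardware.items():
    print(f"{name}: {spec['mhash_s'] / spec['watts']:.1f} Mhash/s per watt")
```

Even with a similar raw hashrate, the FPGA's power draw is an order of magnitude lower, which is where the return on watt comes from.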

The use of FPGAs has even created an industry amongst users with specialist hardware skills, who produce and sell these devices to others in the Bitcoin community.

 

Due to the current difficulty (which is almost 210,000 times greater than that of the first generated block), miners tend to pool together to solve blocks and share in the resulting coin generated. This is a much more consistent method of earning Bitcoin, however the payouts are significantly smaller.

As with solo mining, the more resources a user puts into the pool, the greater their return. Usually miners in a pool direct their mining application at that pool's web address and port number, specify the name and password of their worker, and watch the coin roll in.
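The arithmetic behind pooled mining is simple: your expected payout is your fraction of the pool's hashrate times whatever the pool earns. A sketch with entirely hypothetical numbers:

```python
# Expected pool earnings are proportional to your share of the pool's
# hashrate. Every figure below is hypothetical, for illustration only.
my_mhash_s     = 400          # your rig's hashrate
pool_mhash_s   = 2_000_000    # the whole pool's hashrate
blocks_per_day = 5            # blocks the pool solves on an average day
block_reward   = 50           # BTC per block in the current era

my_share = my_mhash_s / pool_mhash_s
print(f"expected earnings: {my_share * blocks_per_day * block_reward:.4f} BTC/day")
```

Smaller but steadier: exactly the trade-off described above.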

A good example of this is http://mining.bitcoin.cz/ (which also supports Namecoin, a Bitcoin alternative used for domain name registration).
 

The Market:

With the euro's ongoing crisis and threatened collapse, more and more people are searching for alternatives which can act as some form of safety net.

Investors in charge of monetary funds are now beginning to dabble in Bitcoin as a method of growing their pile, and due to this newfound interest and the disillusionment with the eurozone, the value of the Bitcoin has risen sharply in recent times. Over the last 30 days a single Bitcoin (BTC) has jumped from just north of 3 dollars to over 6 dollars apiece.

While most Bitcoin users and market analysts alike view this as a good sign and one of recovery, many people are still very cautious of such a large spike in such a short period of time. It would not be the first time the bubble has burst, after all.

To view the current market charts, head over to Bitcoin Charts.

 

The Risks:

 

In 2011 the Bitcoin weighed in at around 20 dollars a coin, with some market analysts predicting that number would rise to 100 or even 1000 dollars per coin.

A market crash was inevitable, one which even the dogs on the street could see coming.

The smart sold their amassed coin, but many who were caught up in the upward trend did not.

In June of that year the euro tumbled, bringing the value of the Bitcoin right down with it.

It reached a market low of almost a single dollar per coin, leaving the BTC community in disarray and a lot of miners seriously worried about their investments.

 

To add insult to injury, one of the major Bitcoin exchanges, Mt.Gox, had an intruder access its servers using compromised moderator login credentials and transfer and sell almost 500,000 dollars worth of Bitcoin overnight. This left the value of the Bitcoin at nearly zero. Thankfully Mt.Gox were able to reverse the transactions before it was too late, bringing the value of BTC back up to nearly its worth prior to the attack. Trader confidence, however, was left badly damaged, and many saw this as the death knell of the Bitcoin. Since those dark days the Bitcoin has seen growth; not major growth, but gradual, solid growth all the same.

 

Legality & public reception:

 

Due to its transfers being almost completely anonymous (almost being the key word there), the Bitcoin quickly became the medium of choice for those looking to trade weapons, drugs and other illicit goods on the net.

Those in government deeply worried by the decentralized nature of BTC used this as a means to push for it to be outlawed.

However, to ensure this did not come to pass, the developers behind the Bitcoin client have worked closely with various branches of law enforcement across the world to ensure the system is not abused by those looking to commit nefarious acts, and that Bitcoin remains reachable by the long arm of the law.

 

The public, on the other hand, have had a very mixed reaction to this so-called crypto-currency.

As more people begin to take the currency seriously as a viable alternative to well-established, traditional ones such as the euro and the dollar, the legitimacy of the Bitcoin is further cemented.

But there are still those who deem Bitcoin to be a Ponzi scheme and, in essence, a total crock with no real value proposition (in the way you would use money to gain services in place of a barter system). However, when any Bitcoin advocate draws parallels between Bitcoin and any existing, mainstream currency, it becomes painfully obvious that the main reason for this hostility is not the legitimacy of the model involved, but how much it makes people question their own blind faith in the current methods of trading (be that credit, euro, pound or dollar) that they use.

 

Also, more and more small businesses, such as restaurants, and online foundations alike are starting to accept Bitcoin as a method of donation and as a recognized currency for online transactions.

While I wouldn't advise putting all your eggs in the BTC basket, I would suggest you at least look into giving Bitcoin a try as a method of payment if you are an interested party in industry.

 

 

Alternatives:

While many Bitcoin alternatives exist (such as Namecoin, which I mentioned above), the one to watch, in my opinion, is Litecoin (LTC).

Litecoin is a BTC clone which is essentially silver to Bitcoin's gold. Its current market value is a fraction of Bitcoin's, but it is also at a very early stage in its lifecycle. This means that its difficulty is significantly lower than Bitcoin's, and due to its GPU-resistant nature it can be mined on the CPU in parallel with Bitcoin without either mining process being affected.

 

Litecoin uses scrypt as its proof-of-work scheme (where Bitcoin uses SHA-256). Scrypt is memory-intensive, which reduces the efficiency of GPUs down to the level of CPUs. This is good in the short term, but is almost certainly harmful if Litecoin were to gain wide-scale adoption. While the current market value is low, it may one day be half of Bitcoin's current value or higher, and as with Bitcoin, early adopters are rewarded. Hence now is the time to get mining Litecoin (LTC), as its value may increase dramatically alongside its older brother in the coming year.
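Python's hashlib happens to expose scrypt (on builds with a recent OpenSSL), which makes the memory-hardness easy to demonstrate. Litecoin's parameters are N=1024, r=1, p=1; this sketch just shows the primitive, not Litecoin's actual block hashing:

```python
import hashlib

def scrypt_pow(header: bytes) -> bytes:
    # Litecoin-style scrypt parameters: N=1024, r=1, p=1.
    # Even these modest values demand ~128 KB of working memory per hash,
    # which is what blunts a GPU's throughput advantage over a CPU.
    return hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

print(scrypt_pow(b"example block header").hex())
```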

Other crypto-currencies include SolidCoin, LiquidCoin and the aforementioned Namecoin.

 

The Future:

While the Bitcoin may or may not be the one-world currency in 10 years' time that many hope it will be, it is likely to pave the way for future e-currencies which employ a similar methodology. Either way, Bitcoin is one to watch in 2012.

 

For more information on Bitcoin, visit their site (http://bitcoin.org/) or their wiki (http://en.wikipedia.org/wiki/Bitcoin). The source code for the Bitcoin wallet client is also available via its GitHub page (https://github.com/bitcoin/bitcoin).

I hope this article has been educational and cleared up some of the basic points associated with this radical new currency. 

 

The coin is dead, long live the coin.

Report: The Mac family, what's next?

April 30, 2012

With the June Apple announcement at WWDC 2012 fast approaching, many are speculating about the introduction of the next generation of MacBook Pro. The current crop of desktop computers which Apple offer are probably the best in their respective classes (personal opinion), but after several years without a major refresh, they are past due an overhaul. But what changes could Apple possibly make to their next-gen MacBook lineup?

MacBook Pro concept

Apple claim that each of their products is 'revolutionary' in some way or another, and while most are, a few definitely fit that description better than others. One of these is the MacBook Air. Originally introduced in 2008, this device had consumers and designers alike salivating at its über-slim profile and sleek curves, but its price and lack of grunt (amongst a few other niggles, like battery life for instance) put a serious dampener on its overall popularity.
Fast forward to 2010 and Apple once again tried their hand with the Air, reintroducing it in two screen sizes, 11 and 13 inch, with longer battery life, higher-resolution displays, onboard SSDs and, in later revisions, Core i5 and i7 processors.
This move sparked a rush in the industry, with everyone trying to mimic it (none successfully), even spawning an entire sub-genre in the PC market: the Intel-subsidised ultrabook initiative.
When compared to the Air, the 13-inch Pro model looks mighty beefy indeed. Maybe it's time it was put on a diet and given a sleeker, more refined profile.

Another Apple innovation which caused a stir upon arrival was the introduction of Apple's Retina display technology with the iPhone 4, making individual pixels virtually indistinguishable to the human eye (hence 'Retina') and providing the highest pixel density of any mobile device on the market at that time.
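The 'Retina' claim is really just pixel-density arithmetic, using the iPhone 4's published numbers:

```python
import math

# Pixel density = diagonal resolution / diagonal size.
# iPhone 4: 960 x 640 pixels on a 3.5-inch (diagonal) display.
w_px, h_px, diag_in = 960, 640, 3.5
ppi = math.hypot(w_px, h_px) / diag_in
print(f"{ppi:.0f} ppi")  # ~330; Apple quotes 326, the gap being diagonal rounding
```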

The current MacBook Pro display isn't bad by anyone's definition of the word, but after using an iPhone for any significant amount of time you begin to notice the comparatively low quality and pixel density. If Apple were to bump their 13-inch offering up to 1080p I would be happy; however, in order to truly 'wow' the masses, they are going to need to go big: go Retina or go home.

The HDD (hard disk drive) has sat atop the pile when it comes to storage for many years, but with the widespread adoption of the faster, quieter and more energy-efficient SSD (solid-state drive), magnetic hard drive technology is beginning to look more and more like a relic from those bygone days when moving parts were all the rage. Now, with manufacturing costs declining, high-capacity solid-state memory looks viable as a first-class citizen in the MacBook's hardware lineup instead of an optional extra.

Noise pollution generated by the whirring of magnetic storage disks isn't something most would consider a huge problem, but once you have gone solid state and borne witness to the difference first hand, it becomes evident that the hard drive's days are numbered, and that number is fast approaching. Which takes me to my next point.

 

Seeing as we are removing moving components here, why not go whole hog?

I have been the happy owner of a mid-2011 MacBook Air for around 2 months now, and in that time I have not once missed the DVD drive that I had on my previous machine (a MacBook Pro, as it were). I had often thought about replacing the disc drive in my Pro with a secondary HDD or SSD while the Pro was my primary machine, and using the extracted SuperDrive externally if ever it was needed.

 

Every iteration of Apple’s MacBook has inevitably brought with it a spec bump in terms of graphical prowess. But what would it take to really up the ante? 

Nvidia's latest family of graphics processors (cutely named Kepler) has demonstrated a remarkable performance-to-power ratio.

A mid-range Kepler GPU would be more than enough to push the MacBook Pro through some seriously graphics-intensive games and applications without breaking too much of a sweat or turning into a lifeless brick in under an hour. Regardless of what Apple decide to do here, it will be interesting to see what they have up their sleeves.

Now, I love the look and feel of the aluminum/magnesium composite which makes up the MacBook line's unibody shell, but, good jumping Jesus, does it dent with the mildest knock.

A more durable composite (perhaps replacing that ultra-soft alloy with something equally light yet more dense, like titanium, would be a start) would be a *very* welcome addition. Maybe then I can stop treating my Apple gear like newborn children and get on with using it, eh Apple?

With the retirement of the black and white, non-Pro MacBooks, Apple left a distinct lack of options when it comes to individuality. Deep black, cinnamon red and titanium gold would be great anodized alternatives to the brushed metal currently on offer. When the polycarbonate (fancy plastic to you and me) MacBooks were removed from Apple's product line, prospective Mac cultists were left without a slice of the apple pie unless they were willing to fork out over a grand for the up-range Pro models. You would think, with the world in economic ruin, it would be a good time to see the reintroduction of the polycarbonate MacBook as a low-cost option. Almost a gateway drug into the Cult of Mac, if you will.

MacBook concept

Since the switch away from the PowerPC architecture, Apple have relied on Intel to provide them with their desktop computing power. As unlikely as it may be, Apple could consider supplanting the current x86 chips with their own ARM-based (http://www.arm.com/) 'Ax' family of processors, akin to those found in their range of mobile devices. This would serve to reduce production costs, weight and heat output (potentially allowing for the removal of the final moving part in the MacBook, the fan) and allow further freedom from external partnerships with the often manipulative and underhanded Intel. But a change as big as this would undoubtedly spark yet another upheaval amongst developers, who have only just come to terms with the switch to x86 in the first place.

It is possible that none of the aforementioned will see the light of day with the next generation of Apple's iconic Mac family, but I have a gut feeling that, even if they don't, the next MacBook will be truly revolutionary.

Report: Linux hardware support.

April 28, 2012

Linux has come so far since its initial release in 1991. In fact, it beat all the odds to become the first commercially viable open source platform. A favourite of hackers, computer enthusiasts and enterprise alike, there is a lot of love across the board for the little Unix clone that could.
However, Linux, and specifically Linux on the desktop, has had its share of problems that have held it back from mainstream dominance.
Ignoring the competition offered by the likes of Microsoft and Apple, which of these problems is the most serious, and how could it be resolved?

image depicting Linux hardware

I'm referring to Linux hardware compatibility. This is a tricky and rather touchy subject amongst the diehard Linux community and one with no simple solution. 

Many users new to Linux face a huge wall that they must climb when they first install a Linux distribution, and that's that a lot of the hardware on their machine either doesn't work properly or doesn't work at all. While this problem is nowhere near as bad as it was, even as recently as 2006 when I started using Linux as my full-time OS, it's still a serious issue.

An example was found when Ubuntu was trending on Google+ earlier this week, as mentioned in my Ubuntu 12.04 sneak peek. A Google+ member had posted about his new install of Linux, looking for some help with problems he was having. While he loved the desktop and so on (in this case it was Ubuntu's Unity; told you it's not a bad entry point for new users), around 25-30 percent of the main hardware components essential for normal usage were MIA. Sound, hardware-accelerated graphics, wireless support and so on flat out didn't work.

After an hour of running the gauntlet with the aid of around 20 or so Linux vets, he managed to get the various problems sorted. Thankfully he enjoyed the challenge of getting it all working, but he was more tech-inclined than your average user. Should it really be like that?

A mainstream Linux distribution should be a 'point 'n' shoot' affair when it comes to installation and set-up, not a Rube Goldberg machine. On Windows, hardware is set up using an auto-installation wizard which detects and installs the appropriate driver for any outstanding hardware components. While this might be an option for Linux, licensing restrictions mean such a system would be forced to use open source drivers, as those from the actual hardware vendors are proprietary.
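For illustration, the core of such a wizard needn't be complicated. Here is a minimal, hypothetical sketch of the matching step: read the PCI vendor:device IDs and look them up in a driver table (the IDs and package names below are made up for the example; a real tool would pull them from the distribution's driver database):

```python
import subprocess

# Hypothetical ID-to-driver table; real entries and package names would
# come from the distribution's driver database.
DRIVER_TABLE = {
    "10de:1058": "nvidia-current",   # a hypothetical NVIDIA GPU mapping
    "168c:002b": "ath9k-firmware",   # a hypothetical Atheros wireless mapping
}

def detect_missing_drivers() -> list:
    # `lspci -n` prints one device per line; the third field is vendor:device.
    out = subprocess.run(["lspci", "-n"], capture_output=True, text=True).stdout
    packages = []
    for line in out.splitlines():
        fields = line.split()
        if len(fields) >= 3 and fields[2] in DRIVER_TABLE:
            packages.append(DRIVER_TABLE[fields[2]])
    return packages

print(detect_missing_drivers())
```

The hard part, as the licensing point above suggests, isn't the detection; it's having a redistributable driver to install once you've matched the device.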

Mac OS X doesn't need to worry about this issue, as it uses a strict subset of hardware across its product range, with the exception of printers, which use a wizard system similar to Windows'. The Mac OS X approach underlines a root issue and highlights one possible, but (extraordinarily) extreme, solution.

 

That issue: are generic, all-encompassing operating systems really viable any more, and were they ever really viable to begin with? Would it not make more sense for the major distribution vendors such as Novell, Red Hat and Canonical to work closely with the hardware vendors? They could contract them, use third-party contractors or even work in-house to tailor a given distribution to specific products, ensuring every aspect of a machine works and hence can be supported properly.

This would remove the need for distro vendors to support generic versions of their product, and with it the need to include support for every device under the sun. This approach would also remove the problems associated with licensing, as the product vendors could deal with individual licensing themselves. Communities that have sprung up around specific products could then be charged with bringing hardware-specific, from-source versions of distributions to those products (much like we have with the current smartphone paradigm: xda and so on). Smaller distribution vendors could then be left to continue filling the niche they represent. Hopefully this would also weed out the distributions that don't fill any niche whatsoever, or offer nothing more than a theme and a few preinstalled applications on top of the distros they are built on, and which clog up what little mind share Linux has.

This will not be a popular idea amongst Linux-ites, I am aware of that. It represents core change, and they do not like core change, especially change of this magnitude. But come on: in trying to support everything you end up supporting nothing, and everybody loses.

Some feel that Linux can see mainstream success on the desktop by carrying on as it is, and every year seems to be "the year of the Linux desktop". But I have always been a big advocate of change, and unless one as big as the one mentioned above is carried out, the wonderful notion of Linux on the desktop as a mainstream phenomenon will be forever relegated to the pages of history alongside BeOS and OS/2: superior technology that just didn't see success.

You may not agree with my point of view, hell, you might even hate me for it, but one thing we can all agree on is that it would be a monumental shame for Linux to fail when it's come this far.

News: Steam for Linux? I'm excited.

April 25, 2012

It is almost tradition at this stage: a rumor surfaces at some point or another during the year, usually around the same time, that Valve are looking to bring their ubiquitous gaming platform to Linux.

Steam logo

Usually these rumors either fizzle out or are flat out denied, and a plethora of excuses about poor market share, poor graphics performance and so on are given. But when the inevitable (yet slightly late this year) rumor of Steam coming to Linux surfaced earlier today, the stage was set slightly differently.


The Humble Indie Bundle has seen huge success selling its suite of games cross-platform, making the most money per head on Linux, and Desura, a Steam competitor, has seen release on Linux too. Engines like Unigine, which the naval strategy game Oil Rush uses, are rolling out by the week, and things have never looked better for Linux in terms of gaming.


Recently, Linux hardware blog www.phoronix.com announced they would be meeting with Gabe Newell to discuss Steam for Linux, and that, assuming they did not come up against any NDAs or brick walls, they would have more information about the client. In fact, many of the rumors of Steam and the Source engine being ported to Linux over the years have come from Phoronix, so this was a chance for them to prove themselves right. And it would appear they have.
Michael Larabel flew out to Valve's Washington offices and talked to them about the client: whether it was in fact in development (which it is), why it has taken so long (which it really has) and whether or not they plan on bringing Steam to Linux after all. In short, this is what Michael tweeted after leaving the offices: "does have Linux games coming plus other very positive Linux plans... I'll briefly post some screenshots and such tonight." And that says it all. This could dismantle one of the biggest issues Windows users have had with Linux, and the reason many refuse to make the switch: they can't play their favorite Steam games.
I guess with Steam already on Mac OS X and an ever-growing Linux gaming landscape, Steam's entrance into the Linux gaming market was inevitable. Below you can see the Source engine running Left 4 Dead 2 natively on Linux (almost brings a tear to your eye, right? Well, almost).

L4D2 running on Linux

So what does this all mean? It means that graphics OEMs such as Nvidia and AMD will have to start taking their Linux drivers far more seriously and make far more than the Mickey Mouse effort they have up to now. It means the people behind the open source alternatives to the proprietary graphics drivers are going to have to work that little bit harder to get nouveau and the like up to scratch. It also means (assuming this all pans out) that Linux is well on the way to finally becoming a first-class citizen on the desktop.

To read the full details of Michael's interview with Gabe, head over to Phoronix and check it out.

Image courtesy of Phoronix  

Review: 12.04, precisely what was needed.

April 24, 2012

 

It's an exciting time of year for Linux fans: a time when Canonical gets ready to roll out a new Long Term Support (LTS) release of their Linux-based desktop operating system, Ubuntu.

Precise Pangolin wallpaper

 

Yesterday morning I did as I always do first thing and logged into Google Plus. Much to my surprise, in the 'trending' section was #Ubuntu. I scanned down through the feed and I have to say the response and support Ubuntu was getting was astonishing. I wanted to contribute in some way, so I decided I'd write up a review of 12.04, codenamed Precise Pangolin, in its current state before launch.

 

So how does this release fare against the last few? With a lot of subtle enhancements and some big ones.

The release of 11.04 introduced the Ubuntu Netbook Remix interface (which debuted in the 10.10 Netbook Edition), aptly dubbed Unity, as the default desktop environment. This was a move which garnered a huge amount of criticism, as it was a radical departure from what came before (GNOME 2, which at that stage was becoming antiquated). It was disliked by a great many people, mainly because it was not fully understood and was felt to hurt productivity, but also because it wasn't feature-complete: its design was clunky and it was a massive resource hog.

Thankfully, however, the next iteration (11.10) brought a far more streamlined Unity, which filled in many of the features missing from the prior release. Now, with the release of 12.04, we see a much more polished beast.

 

So what's new in 12.04? A lot.
The OS is now based on the 3.2 Linux kernel, which has been rebranded as the Ubuntu kernel. 
 

Firefox 11

The default browser, which is still Firefox (and not Chromium as I had hoped), has been updated to the latest stable release, version 11.

Historically, Rhythmbox had been the default music player in Ubuntu, but from 11.04 to 11.10 it was replaced by Banshee. Banshee, however, is based on the .NET clone Mono, and since Ubuntu no longer ships the (rather large) Mono libraries with its releases, they have reverted to Rhythmbox as the default, this time at version 2.96.

The two stock themes, Ambiance and the lighter Radiance, have been overhauled to keep them more in line with GNOME 3's default theme, Adwaita, and a tweaked version of the purple light-patterned background has been included alongside a new pangolin-themed one (it has been tradition to include wallpapers which depict the animals the releases are codenamed after).

Unity also got bumped up to version 5.10, bringing a whole host of goodies, including the removal of the oversized novelty icons which resided in the Dash's (the main menu's) favorites section, replacing them with search lenses for files and folders, video, music and applications, as well as a global option for wide-spread searches.

Ubuntu lenses, showing the video lens

This update also introduced a new progress animation in the launcher bar for the installation of apps from the software center (which now includes the ability to buy and sell more content than before), and so on.
The icons that reside in the launcher can be resized to allow for more or less screen real-estate usage, and the launcher now has better support for multiple monitors, or "dual head" mode.

Another big update in 5.10 is the inclusion of the HUD, or heads-up display, which acts to supplement (and will eventually totally replace) application window menus.


Ubuntu HUD

Otherwise, there have been various version updates for the built-in applications, such as LibreOffice, Ubuntu One (which got a rather nice UI overhaul) and the Thunderbird mail client, alongside the removal of the advanced Synaptic package manager and the inclusion of the Landscape management service, which allows administrators to monitor, manage and update an entire network of Ubuntu computers from a single point. That's a lot of good stuff right there.

My personal take on the release is very positive, and while it doesn't sound like they have done anything too groundbreaking in 12.04, with each release the focus on the user becomes more and more evident, culminating with this: Ubuntu's Unity finally starting to leave that dark tunnel and enter the warm light of general acceptance.
This release, even early on, has felt solid, smooth and WAY more user-centric than not only any other version of Ubuntu before it, but any other flavor of Linux I have used to date. In fact it ran marginally better (i.e. quicker) than Mac OS X 10.7 on the iMac it was being tested on, but shhhh, don't tell anyone.

In my opinion, all of the above is what makes Ubuntu a fantastic poster boy for Linux on the desktop.
I wish Canonical the greatest of successes with the release of 12.04 Precise Pangolin in a few days' time. I think they might be on to a winner with this one.

Also, on a side note, if you are on the fence about whether or not to give Ubuntu a try and you are a Lady Gaga fan: the queen of bizarre herself uses Ubuntu as her full-time system of choice, so now you have no reason not to.

The latest version (currently Beta 3) is available here, with more up-to-date daily builds (which I strongly recommend, to get as near to a final experience as possible) available here.


The next version of Ubuntu is coming soon

PSA: Keeping up with the blog.

April 23, 2012

Social media is a great way to share and communicate with friends; it's also a great way to learn of the latest articles from sites you follow. Some of you may wish to follow the site and myself to get the latest editorials, news articles and reports that I publish. So here are the places where I can be found.


RSS

Twitter:  @patrickjquinn 

Google+:  +Patrick James Quinn  



And that's about it!

All articles will be posted to both as soon as they go live.

See you all soon hopefully :)  

Patrick out.

Report: The alternative OS, my top 5

April 23, 2012

I have always had a fascination with the idea of alternative operating systems, and over the years I have come across and played with (even developed applications for) a variety of them. However, there have always been a few I've followed closely which have stood out, head and shoulders above the crowd. I have compiled a short list and description of what are, in my opinion, the top 5 alternative, open source operating systems out there.

 

Number 5: AROS Icaros 1.4.1

AROS, formerly known as the Amiga Research Operating System, since changed to the recursive acronym AROS Research Operating System to avoid trademark troubles, was started in 1995 by the small AROS development team and had its initial release in 1997.

The Icaros desktop

Its goals were to create a modern Amiga OS clone for personal computers (PCs to you and me), and over the years it has largely succeeded in doing so. The platform is released under the GPL and has seen several spin-off distributions, including Broadway, the Acer Aspire One centric AspireOS (an Aspire One being a machine I own) and my personal preference, Icaros Desktop, which includes 3D-accelerated graphics for Nvidia cards and many third-party applications.

If you want to know more about AROS or try it for yourself, you can find it all via their blog (http://vmwaros.blogspot.com/). So that's AROS, our number 5.

 

Number 4: MenuetOS M64 0.98.51

MenuetOS is a real time operating system with preemptive scheduling developed by Ville M. Turjanmaa.

It saw its first release in 2000 and is licensed under the GPL.

The MenuetOS desktop

Menuet features a fully fledged graphical desktop environment which includes basic applications for general use, graphics drivers and a monolithic kernel, all (astonishingly) written in assembly language (FASM, to be precise), which is no small feat. It runs on FAT32, supports multi-core processing and comes in 32-bit and 64-bit variants. It can also play Quake. Need I say more?
If you want to learn more about this sub-2-megabyte wonder, you can find it at the project site (http://www.menuetos.net/). Again, that's MenuetOS, our number 4.

 

 

Number 3: Syllable OS 0.6.7

After the AtheOS project stagnated and died, a small team of developers decided they would fork its source, and Syllable was born.
Syllable is a much more modern take than the previous two entries, with a native, WebKit-based web browser (the same engine you would find under the hoods of Chrome and Safari) called Webster,

Syllable OS desktop

an email client called Whisper, a media player, a spatial file manager, a fully featured graphical desktop environment built on its own GUI toolkit and much, much more. It also has its own IDE for third-party application development, a scripting language called REBOL, is POSIX-compliant and supports preemptive scheduling, SMP and its own 64-bit, journaled file system called AFS (the AtheOS File System). It also provides good hardware support, with drivers for a large number of modern devices. Syllable is yet another GPL-licensed OS, and if you wish to test drive Syllable or grab the source code for yourself, you can head over to their main page (http://web.syllable.org/pages/index.html), grab a copy of either the desktop or server LiveCD image and get tinkering. At number 3, Syllable OS.

 

 

Number 2: ReactOS 0.3.14

Our number 2 is a small project with big aspirations. The GPL'd ReactOS was started in 1997 as a spin-off of a former attempt (FreeWin95) to clone Windows 95.
Its aims were to build a modern operating system with full binary compatibility with Windows 2000/XP using Wine, which stands for (yet another recursive acronym here) Wine Is Not an Emulator.

ReactOS desktop

After various incremental updates over the years, it now supports a slew of Windows applications, games and drivers, and does so quite well. It also has its own Shell32 clone which, as of the current release, supports theming.

Controversy arose in 2006, however, when an accusation was made that disassembled code taken directly from Windows had been used for various parts of ReactOS. After a lengthy code audit this was proven not to be the case (fair play to them for not succumbing to that temptation).
ReactOS boasts the proud achievement of being able to play a variety of DirectX 9 games, such as Unreal Tournament, Quake 1-3 and Halo 1, and as the releases roll on, more applications and games are added to that list. For more, head over to their home page (http://www.reactos.org/en/index.html) and dig through their documentation and application test results database. The venerable Windows clone ReactOS, at number 2.

 

 

Number 1: Haiku OS R1 alpha 3

Some of you may be old enough to remember BeOS, the software platform which began life in 1991 at the now defunct Be Inc. BeOS initially ran exclusively on the company's incredible BeBox, powered by dual AT&T Hobbit processors (as well as three AT&T 9308S DSPs), but was eventually brought to the PowerPC and x86 architectures.
After the untimely demise of Be Inc. in 2001, an effort began in 2003 to create an open source clone of BeOS, carrying on the legacy of its former parent. OpenBeOS, as it was then known, was based on the NewOS kernel written by former Be Inc. engineer Travis Geiselbrecht and was released under the MIT license by the newly formed non-profit organization Haiku Inc.

Haiku OS desktop

However, after a trademark infringement notification in 2004 from Palm, the then owners of Be Inc.'s assets (and now themselves dead and gone), the platform's name was changed to Haiku OS. Haiku has an interface model which lets third-party application developers interface with key components in the system for gaming, input, audio and so on. It also has its own filesystem (OpenBFS), OpenGL support and rudimentary hardware acceleration, as well as a beautifully functional, wing-tabbed desktop environment. Haiku also has a slew of modern, desktop-orientated applications, like a WebKit-based browser dubbed Web+ (or WebPositive, after the BeOS browser NetPositive), VLC support, graphical manipulation and so on. The OS adopts the principles and ethos of BeOS and has done a great job of carrying on its legacy.

In fact, I may have to do an article soon on BeOS and its advanced features, such as protected memory, which are only now seeing widespread use in modern operating systems. Haiku is sadly only available for 32-bit systems and comes in a few varieties, compiled using both the older GCC 2, to maintain backward compatibility with BeOS, and the more modern GCC 4. If you are interested in getting your mitts on Haiku, you can either download the current "stable" release, R1 alpha 3, from their main page (http://haiku-os.org) or head over to Haiku Files (http://www.haiku-files.org/), which I strongly suggest, as much has improved since the last public release. So at our top spot, I give you Haiku OS.

 

So that wraps up my top 5 list of alternative operating systems. I hope you have enjoyed reading it as much as I did compiling it, and I hope you enjoy exploring these systems as I have. Please feel free to show your support for the projects above; with all of the work they have done over the years in the name of computing, god knows they deserve it.


 

 
