Top Cloud Practitioners Come to Europe to Talk Big Data

GigaOM is holding its first major European conference on the 16th and 17th of October in Amsterdam. Titled “Structure:Europe”, the conference features many C-level executives from top technology companies, with a focus on big data, internet infrastructure and the cloud. Some of the technologists and business leaders speaking at Structure:Europe include: JP Rangaswami, Chief Scientist; Bernard Dalle, General Partner at Index Ventures; Bob Jones, Head of CERN openlab; Dan Levin, COO of Box; Tony Lucas, Founder of Flexiant; Amr Awadallah, Founder of Cloudera; and Michelle Munson, CEO of Aspera. We also have a 40% discount on registration for our readers.

Whether we realise it or not, many of us have become accustomed to working with the cloud, whether it be accessing movies, music and other forms of entertainment online or co-authoring and sharing documents using Google Drive, Box and Office 365. In cloud computing, computing power and storage are drawn from a cloud of machines that may be physically remote from the local user or application. Distributing processing and data requirements across many machines allows more powerful computing tasks to be carried out than a single local device could manage (see “Cloud Computing: From Capacity To Capability”).
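The capacity-to-capability idea can be sketched in miniature. The snippet below is a toy illustration rather than real cloud code: it splits one job across a pool of workers, the way a cloud spreads work across many machines.

```python
from concurrent.futures import ThreadPoolExecutor

def word_count(chunk):
    """Count words in one chunk of text -- a stand-in for any divisible task."""
    return len(chunk.split())

def distributed_word_count(document, workers=4):
    """Split a document into lines and fan the work out across a pool of
    workers, the way a cloud job is spread across many machines."""
    lines = document.splitlines()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(word_count, lines))

# A document far too tedious for one "machine": 100 lines of 4 words each.
doc = "\n".join(["the quick brown fox"] * 100)
print(distributed_word_count(doc))  # 400
```

The same pattern scales from threads on one machine to processes, and from processes to whole fleets of virtual machines, without the caller needing to know where the work physically runs.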

Structure:Europe will look at emerging technologies and issues in relation to cloud computing and the corresponding growth in big data. According to Alison Murdock from GigaOM: “We are bringing the best cloud practitioners, from the US and Europe, together to talk about big data, APIs, federated clouds. We’ll have the head engineer of Facebook, Marten Mickos of Eucalyptus, Werner Vogels of Amazon, and lots more.”

One of the companies that will feature at Structure:Europe is DataSift, a social data insights platform from the UK. Founder Nick Halstead will be a speaker at the conference. We previously interviewed DataSift’s Sarah Blow about the filtering technology underpinning DataSift, which enables highly refined searches across a selection of social networking platforms.

Another of the speakers is Scott Dietzen, CEO of Pure Storage, a provider of flash data storage solutions for enterprise. Five years ago, we spoke to Scott about software-as-a-service (SaaS) and the cloud when he was President at Zimbra. Now, Scott’s focus is on speeding up access to the data used by cloud services in data centres. A 10x increase in speed is possible using flash drives based on SSD (solid state drive) technology, along with associated space and energy savings.

Expertise in cloud computing has grown significantly in Ireland over the past few years. Zartis provides recruitment solutions powered by the cloud. JLizard’s Logentries is a cloud-based system that collects and analyses log entries from large software systems. Northern Ireland company AirPOS provides vendors with point-of-sale solutions from the cloud. FeedHenry spoke to us earlier this year about their cloud-based mobile application system. Inishtech’s cloud-based platform for intellectual property protection serves over 100 clients. Cork Institute of Technology launched two new cloud computing courses in 2011 and some graduate conversion courses in 2012. And US company Marketo also announced significant job hiring for its Dublin-based cloud operations recently.

Big data and its impact on internet infrastructures is ‘big’ here too, especially as we all struggle to deal with the information overload associated with increasing amounts of data. Polecat’s MeaningMine aims to provide meaningful answers from big data. Linked data is a means to relate disconnected datasets to each other, and has been proposed as a new layer for the Internet by DERI (the world’s largest Semantic Web research institute). Also from DERI, Sindice have created a semantic search facility based on data-as-a-service (DaaS) through a cloud-scalable infrastructure that ingests 100 million semantic documents per day.

The word cloud on the right, built from the Structure:Europe schedule, gives you an idea of some of the topics to be covered. The main topics are big data; innovation in infrastructure; cloud adoption across Europe; regulatory concerns such as privacy and policy; IaaS and PaaS; public cloud vs. private cloud; and venture capital investment in the cloud.

If you’re a player in the cloud computing and big data space in Ireland or elsewhere, Structure:Europe should be an essential event.

Readers of Technology Voice can save 40% on registration by using the promotional code “TECHVO40OFF” when registering. For further questions, please email the GigaOM team.

Can Your Mobile Phone Help You Get Fit?

Ted Vickey (on right) with John Breslin and researchers from NUI Galway.

Can your mobile phone help you get fit? A researcher at the National University of Ireland, Galway (NUI Galway) and former White House fitness expert will pose this question at the 5th Annual Medicine 2.0 Congress which opens in Harvard Medical School, Boston, tomorrow.

Ted Vickey is a PhD researcher at the Digital Enterprise Research Institute (DERI) and the Discipline of Electrical & Electronic Engineering at NUI Galway. His company FitWell won the White House Athletic Center contract in 1995. At Medicine 2.0, Ted’s presentation to delegates will show that “understanding one’s social network may be one key to better health”.

“Rather than surfing in the ocean, we are surfing the web. Rather than an outdoor game of tennis under the sun, we are inside our homes playing online virtual tennis on our Wii. People drive their cars to the gym and then take the escalator to the front door rather than walking and taking the stairs,” explains Vickey. “But what if technology could be the solution to our problem? What if our mobile phones could track our every step, provide healthy tips during the day, even persuade or motivate us when we need it most? This dream is now a reality all across the globe and it is called Mobile Health.”

There are an estimated 13,000 health-related apps in the iTunes app store: everything from monitoring blood pressure to tweeting body weight to tracking sleep cycles. A subset of these are fitness-related apps (MapMyFitness, Nike+, etc.) for monitoring and reporting on a person’s exercise characteristics. One way to share some of this exercise activity data is through microblogging services such as Twitter.

Various studies have indicated that “lack of motivation” is a key factor in why a person does not exercise. With social sharing of exercise activities using mobile fitness apps becoming more common, understanding and leveraging one’s social network may be one key to better health through exercise. However, the effectiveness of sharing one’s physical activity online via social networks has yet to be fully understood. More research and best practices are therefore needed to show how advanced social web technologies may effectively address the lack-of-motivation excuse, and thus increase exercise adherence and general health.

As part of his PhD research, Vickey and his colleagues at NUI Galway have collected over 4.5 million tweets sent via mobile fitness applications from around the world. These were then grouped into categories, in an attempt to understand the correlations between online social networking and effective exercise motivation and adherence. For each person who shared a workout online, the researchers looked at their social network structure and their online influence, while determining a fitness classification, exercise intensity, exercise duration and motivation for that person.
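The study’s actual classification scheme isn’t described here, but a first pass at categorising fitness tweets by keyword might look like the following sketch. The categories and keywords are illustrative assumptions for the example, not the scheme used by the NUI Galway researchers.

```python
# Illustrative only: these categories and keywords are assumptions for the
# sketch, not the classification scheme used in the NUI Galway study.
CATEGORIES = {
    "running": ["ran", "run", "5k", "10k", "marathon"],
    "cycling": ["cycled", "bike", "ride"],
    "gym":     ["workout", "lifted", "gym"],
}

def classify_tweet(text):
    """Return the first fitness category whose keywords appear in the tweet,
    or 'other' if none match."""
    words = text.lower().split()
    for category, keywords in CATEGORIES.items():
        if any(kw in words for kw in keywords):
            return category
    return "other"

tweets = [
    "Just ran 5 miles with @Nike+ #fitness",
    "Great gym workout logged via MapMyFitness",
    "Lovely morning for a bike ride",
]
print([classify_tweet(t) for t in tweets])  # ['running', 'gym', 'cycling']
```

At the scale of 4.5 million tweets, a real pipeline would add normalisation, hashtag handling and per-app parsing, but the keyword-matching core stays the same.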

“Mobile fitness apps not only allow for the sharing of information between user and healthcare providers, but also with a user’s friends. These self-monitoring units will help change the face of healthcare around the globe”, said Vickey.

Vickey’s paper, ‘Estimating the Long Term Effectiveness of Mobile Fitness Apps and Exercise Motivation’, has been shortlisted for the iMedicalApps Medicine 2.0 mHealth Research Award. His research at NUI Galway is funded by the Irish Research Council in conjunction with the American Council on Exercise (ACE Fitness), the largest non-profit fitness certification organisation in the world with over 50,000 professionals, and by Science Foundation Ireland. Vickey also serves on the Board of Directors of ACE Fitness.

Established in 2003 by NUI Galway and Science Foundation Ireland, DERI has now grown to become the world’s largest semantic web research institute. It engages with companies, from startups through to multinationals, to develop new web solutions. The Discipline of Electrical & Electronic Engineering at NUI Galway also offers a degree programme in Sports and Exercise Engineering, focusing on the convergence between electronic systems and exercise.

Cloudbusting in Galway, Ireland

If you have ever wondered why cloud computing is so important, and are curious about the practical applications the technology can be used for, then a trip to Ireland’s West Coast could supply you with a great many answers – many of which should surprise and delight.

Cloudbusting is the name that Damien Joyce and his colleagues have given to the free event that takes place at the Galway-Mayo Institute of Technology (GMIT) on Friday, September 14.

After three months of sustained preparation, those who choose to attend will be met with a panoply of talks centred around six conversational streams. As Damien straightforwardly points out, “If you are planning on working in IT in the next five years and you don’t find something of interest to you then I think you should reassess your decision.”

A sign of a successful day for Damien would be one, “Where people were struggling to decide which talk to go to.”

Leading out the vast and varied array of speakers is David Allen, Adjunct Lecturer, Digital Brand Strategy at University of Oregon and founding member of the band, Gang of Four. According to Damien, David is likely to be, “Very vocal on the future of music technology and also digital and social media.”

Apart from being a former GMIT student (he studied at the Castlebar campus), Damien has a very specific reason for choosing the college as the site for the event rather than some anonymous hotel conference room: “It’s about technology, it’s about the future, it’s about people learning.”

Referring specifically to the conference he says, “It’s about demystifying the cloud. We all talk about the cloud, everyone talks about the cloud. I am trying to get across to people that the cloud provides opportunity.”

He cites the example of Domino’s Pizza in the States: “364 days a year they don’t really care about how many servers they have but they don’t want to pay for anything additional that they are not using. The one day of the year they are very interested is the day after Thanksgiving when no one wants to cook and are ordering in pizza. So Domino’s want to spin up as many virtual machines as they can to take the business. They are not a technical company but they are using cloud computing as support.

“It’s the same for government, we have speakers like Tim Willoughby and Gar Mac Croista who will talk about what the various agencies are going to do with the cloud and how they are going to do it. This also leads into open data and that leads to other opportunities.”
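The Domino’s pattern, paying for extra capacity only on the day it is needed, is the essence of elastic provisioning. A toy autoscaling rule might look like the sketch below; the per-VM capacity and baseline fleet size are made-up numbers for illustration.

```python
REQUESTS_PER_VM = 500   # assumed capacity of one virtual machine (illustrative)
MIN_VMS = 2             # baseline fleet kept running year-round (illustrative)

def vms_needed(requests_per_minute):
    """Scale the fleet to demand: enough VMs to serve the load,
    but never fewer than the baseline."""
    needed = -(-requests_per_minute // REQUESTS_PER_VM)  # ceiling division
    return max(MIN_VMS, needed)

print(vms_needed(300))    # quiet day: the baseline of 2 suffices
print(vms_needed(50000))  # day after Thanksgiving: scale out to 100
```

A real autoscaler would add hysteresis and scale-down delays so the fleet doesn’t thrash, but the core decision is just this: capacity follows demand, and you pay only for what you spin up.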

Damien says that his own personal reasons for taking this project on were about his desire to, “Promote a culture of openness around this sort of engineering. I want people to be interested in technology and learn about technology.

“I’d love it if someone came along to one of these talks and had a light-bulb moment: ‘I want to learn a bit more about that,’ or ‘I want to go into that area.’”

Damien regards the cross-fertilization of ideas and experiences as very important as, “A lot of people work in silos.”

To counter this isolation and its inherent restrictions on communication and growth, concepts such as devops have emerged in recent years. This particular approach to collaborative and coordinated working was engendered by the needs of early cloud start-up companies, in combination with newly prevailing development methodologies such as Agile.

Events like Cloudbusting are designed to expand this theme of collaboration and spread the culture of sharing.

One of the most fun things about learning something new is sharing your ideas and knowledge with others. Fortunately, Galway City provides a plethora of opportunities for getting together with like-minded individuals and socializing. Damien has selected Kelly’s Bar as the location for attendees to gather after the event to relax and chat.

In addition you can check out live performances from Sive, Twin Terrace and ImYourVinyl. Also, Sive have kindly agreed to play a few acoustic tracks at lunchtime in GMIT.

As the frontiers of cloud computing expand at ever-increasing rates, it is no longer possible for any one person (of sound mind, at least) to declare themselves the ‘expert’ in this area, or to be considered as such. The moment when that was possible has long passed.

All you can really do is get stuck into the aspects of cloud computing applications and technology that make your heart sing, in the knowledge that there exist other explorers of this new frontier beside you, and maybe just a little ahead of you, whose knowledge and experience you can call on.

But beyond that, events like Cloudbusting offer the most fun thing of all for a person of curiosity – serendipitous discovery. An idle conversation in a crowded hallway, a talk title that doesn’t make sense but makes you want to find out more, a connection made with somebody that may not have happened in any other circumstance. These chance occurrences are the lifeblood of progress.

It is at events like these that there exists the possibility of hearing the right sort of brilliant and intriguing remark, one that can open up whole new worlds of opportunity and possibility.

Considering the quality of the speakers and attendees, the sheer range of learning and expertise on offer, and the opportunity for an entire day and evening of conversation and shared learning, Cloudbusting has the mix of ingredients and the critical mass to produce something very special.

Cloudbusting takes place on Friday, September 14 at the GMIT campus in Galway.

So You Want to Break Into the Games Industry? Here’s How…

You have a passion for computer games, and you think you want to work in the computer games industry. How should you go about it? You could listen to the advice of Ian Schreiber for a start. Ian has worked as both a programmer and game designer, as well as teaching game design and development at Ohio University. He recently shared some tips with students and young researchers involved in the games area about how to get that ideal games job.

If you’ve ever been a student in college, you probably know that there are always a variety of motivations for how colleges work and what they should ideally do. Student success is the primary one, but that success may not entail you getting your dream job in the career area of your choice. However, what you do in the lead up to that job hunt can help you maximise your chances of reaching your goal. There are two main parts to this: (1) knowing what your goal is; and (2) figuring out how to get there.

For (1), knowing your goal, those already in a games degree programme probably have a good idea of what the job entails, but for others it may be trickier. A typical conversation would be: “I love playing games, so the thought of making them sounds really cool.” “Are you going to be a programmer or an artist or a game designer?” “Oh, what’s the difference?” You need to understand that first before you go any further.

For (2), getting there, the games industry is fairly straightforward in terms of what they are looking for. In the main, they just want to make awesome games. “Awesome” differs from company to company, whether it be a well-reviewed game, one that’s got great gameplay, or one that makes lots of money. They basically want to know if you can be part of the team that can help them make that awesome game.

So, the challenge is showing them that this is something you can do. You need to provide credible evidence that you can do it. How? The most obvious way is simply by making games. If you’re not already making computer games on your own because you love doing it so much (and you’re already in a games course at college), then you might want to consider changing degrees because what do you think you are going to be doing full time after graduation?

If it’s not an activity that you will love doing, Ian said you should reconsider going into it as a career, as the pay is lower and the working hours are worse than in some similar careers. (Check out Glassdoor to read verified anonymous reviews from people working at various companies, either praising their company and its benefits, or spilling dirt on their employer and why it sucks to work there. Valve, for example, gets good reviews.)

You need to decide if this really is a career you want, as it’s better to find out as an undergraduate than to go into industry and burn out. Five-and-a-half years is the average length of a career in the games industry before burning out (that’s a full career, not a single job), after which most go and do something else. But if you listen to Ian’s advice, do your research first, and still go into the games industry, you will probably enjoy it, and it may well be the best job ever.

Ian is co-author of the book Breaking Into the Game Industry: Advice for a Successful Career from Those Who Have Done It. He wrote the book with another industry veteran, having himself worked in the games area for 12 years. As part of their research, the authors asked a series of games industry leaders to provide paragraph-long answers to the questions they are asked all the time, and the resulting combination of answers has made a useful guide for job seekers in the industry.

Ian cited personal experience in his quest to become a games designer as opposed to just a programmer. Having programming skills is useful because if you don’t know what’s easy and what’s hard to code, your game designs will be brilliant but impossible to execute. Games companies are also very cautious in hiring designers since a mistake on the part of the designer can have serious repercussions that can bleed across departments. It’s a position of trust, and if the company already has a designer, they tend not to want to give that trust to anyone new.

To get into game design, you have to “play nice” with others: start to work with game designers, approach the work very carefully, show some design prototypes you did on your own or some ideas you had that got into the final version of a game: basically, build up some evidence to show that you can be a good game designer too.

There are other ways to demonstrate that you have a range of non-technical or soft skills that a company is looking for, whether it be relevant non-technical subjects studied (that history minor may be relevant for historical games) or your ability to work in a team. Show that you have a track record of working on a team with other students, and if the opportunity arises, try and take leadership positions in games being developed in or out of class. It’s good to show that initiative: that you are capable of doing things without being asked or required to, for example, by showing that you made games outside college on your own because you wanted to.

There are some in the games industry who claim that they would rather not have done an undergraduate course, but instead would have spent every moment teaching themselves how to make games and doing nothing else. Ian disagrees: college makes you more rounded and helps with breaking into the industry. The most useful thing about college, and spending four years in a safety net from the outside world, is that you have time to experiment on games projects and ideas that you couldn’t get away with anywhere else – and you can do it without costing a publisher $3 million on a failed project. You also have a bunch of like-minded people in college with similar interests and career goals, and that’s a huge resource you can make use of.

Ian also talked about the difference between entertainment games and serious games. Jobs creating serious games are a lot less competitive than those in the entertainment games industry, and with fewer applicants it can be easier to get your foot in the door. But attracting fewer people means that serious games tend not to be as well polished as entertainment games. The area is really challenging and interesting, and serious games are certainly harder to make than entertainment games: they not only have to be fun or profitable, but there is also that additional purpose weighing on you. It can be very rewarding to be able to say “my game helped end a war” or “my game helped save 500 lives”. Ian advised those interested in serious games to attend events like “Games for Change” or the Serious Games Summit.

He stressed the importance of going to games conferences and networking, as this is very important in the games industry. The saying “it’s what you know” is better put as “it’s what you know AND who you know” for the games industry, as you have to know the right people if you want to get that ideal job.

If you haven’t built any complete games, mods (modifications to existing games) can still work well in a portfolio, especially if you can point to one and say that you thought a particular game was good, but this was a weakness you found after analysing its design, and this is what you did to improve on it (it needs to be more than a funny-shaped level).

One valuable piece of advice from Ian was not to throw everything you’ve ever done into your portfolio. Your portfolio should contain your strongest work, because the entire set is only as strong as its weakest link and should show the best you are capable of. Don’t pad it out with early work like that badly-drawn polygon animation with lens flare. Put in the work that shows what you can do – whether it be mods, design documents or full working games. Of course, this depends on the company: showing a Half-Life mod when applying to Valve will carry a bit of weight!

If you’re a budding programmer, you may also wonder about the demand for those with artificial intelligence experience at undergraduate or postgraduate level. If you can show working games with some AI, this can be pretty compelling, but the downside is that not everyone needs an AI programmer (FarmVille certainly doesn’t), and the academic notion of AI often differs from real gaming requirements. The perfect academic AI will win in the best and most efficient way possible; the gaming AI will put up a good fight and maybe lose, but it will be fun to engage with, demonstrating intelligent play to make the game feel more awesome.
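One common way to build an AI that puts up a good fight rather than playing perfectly is to blunt its decision-making: rank the candidate moves, then pick randomly among the top few instead of always taking the best. A minimal sketch follows, where the scoring function is a placeholder standing in for a real game heuristic or search result.

```python
import random

def score_move(state, move):
    """Placeholder evaluation: in a real game this would be a heuristic
    or search result estimating how good `move` is in `state`."""
    return state.get(move, 0)

def choose_move(state, moves, skill=1):
    """A 'fun' AI: rank moves by score, then choose randomly among the
    top `skill` moves. skill=1 plays perfectly; larger values blunder."""
    ranked = sorted(moves, key=lambda m: score_move(state, m), reverse=True)
    return random.choice(ranked[:skill])

# Toy position: move "a" is clearly best, "c" clearly worst.
state = {"a": 9, "b": 5, "c": 1}
print(choose_move(state, ["a", "b", "c"], skill=1))  # always "a"
```

Tuning the `skill` knob per difficulty level gives exactly the behaviour described above: an opponent that plays intelligently enough to be engaging, but imperfectly enough to be beaten.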

There’s also the commonly-asked question of how those in the games industry balance their time between playing games for fun and making them. Making games is very demanding and time consuming. You could spend up to 16 hours a day getting that next milestone out the door, and may not get much time to play. But as a professional game designer you need to play games, because you are doing “research”. As seen on the show Extra Credits, there’s a difference between playing as a designer and playing as a player. As a designer, while playing you are analysing your own play: “Oh, I’m feeling joy with this level. Why is that?” It’s a bit like a professional comedian dissecting another’s jokes: something is lost along the way. For a designer, shutting off that analytical part of the brain is very hard, but you can still play games that are different in nature to those you are making, and enjoy them.

And that’s what it’s all about at the end of the day. Someone somewhere has made a computer game for your fun and entertainment. Hopefully you can do the same for somebody else someday soon.

Ian was speaking at the International Games Innovation Conference, run by IEEE’s Consumer Electronics Society.

From Arcades to Apps: The History of Computer Games is Repeating Itself

Image adapted from Wikipedia.

Seamus Blackley, co-creator of the Xbox, has a theory. The new arcade is the tablet, the mobile, the app-powered touchscreen device of today. What we are seeing today in games apps has all happened before: we just need to look back to the arcade games boom of the early 1980s, in particular their adoption by a widespread demographic. But we also have to learn from the arcade games crash and make sure that the same doesn’t happen to the games apps ecosystem.

Blackley was the keynote speaker at the International Games Innovation Conference, run by IEEE’s Consumer Electronics Society, where he spoke about the birth of arcades and what it means for those now in the games industry. His new company Innovative Leisure has recruited a venerable team of arcade game veterans to build arcade-like games for touchscreen devices. He is also known as a transforming force in the games industry, revolutionising how many play games today when his team at Microsoft articulated a vision for a games system powered by a personal computer – the Xbox.

A self-confessed games nut who got into the games business because he loved video games more than anything else, Blackley felt so compelled to make video games that he was inexorably drawn in. As he says himself, at one point he woke up wondering what he was doing in the industry, what it meant, how he could make a success for himself there, and how he would explain this new industry to his parents or to friends at parties (although this became easier as games became more mainstream). In the 80s, you got a blank stare for being a games designer, and many were unaware of the computer technology powering these entertainment devices. There was a curious and refreshing cultural disembodiment as people responded to games as an entertainment medium and not a technology. “Non-computer” people had permission to play games as they became a widespread cultural trend: gaming wasn’t a geeky activity, as computer culture was only just starting.

Before the birth of the computerised arcade games era, the earliest electromechanical arcade games like pinball were a wonder to behold. In fact, they provided the context for computerised arcades, because without them the audience wouldn’t have appreciated the leap in gaming when the first video arcades were released. Computer Space, shown on the right-hand side, was the first commercial video game to be sold, in 1971, based on the Spacewar! PDP (mainframe) game from the 60s and displayed on a TV vacuum tube. Similar to Asteroids, it featured an animated starfield with flying saucers shooting at the player’s rocket ship. What was novel was that the player’s bullets could track a ship and could also be controlled by the arcade buttons. But many still wondered what this thing was, and why no TV shows were being displayed on this tube-like screen in a big box. Computer Space was eventually a failure because it was simply too complex: people just couldn’t figure out what was going on with it. Pong came soon afterwards, in turn inspired by the earlier game “Tennis for Two”, and through its simplicity it achieved far more widespread success.

There was a great sense of entrepreneurial spirit in bringing these arcade games to the masses, but there was a terrible problem unanticipated by the producers: copying. They hadn’t trademarked their games (why should they?), and Pong became so successful that it was copied multiple times. So, what to do next? The arcade game producers hired teams to come up with ideas and play around with them, going beyond the different manifestations of Pong to produce driving games, flying games, etc. Games started appearing all over the place, and the instantaneous growth in the scope and range of arcade games in the late 70s and early 80s was completely extraordinary (sound familiar?).

At its heart, the arcade game industry was essentially a refrigerator manufacturing business, but the market was huge. Asteroids alone was a $4 billion business, producing over 80 thousand cabinets in the 1980s. The Battlezone Asteroids-type arcade game was a technical design disaster by today’s standards: high voltages inside the case, fluorescent lighting, plastic shrouds, and a 400-pound cabinet in case people tried to steal it (and people did, taking the cabinets from the pickup trucks used to transport them and leaving the pickups behind). The arcades were extremely profitable: an Asteroids-type machine would make $400 a week.

To illustrate the growth of this industry, in 1978 the US domestic games business was $50 million. Three years later, there were $900 million in sales of cabinets and $5 billion was spent on these arcades in quarters. In 1982, this figure rose to $8 billion in quarters ($19 billion in 2012 money). Atari at that time was the fastest growing company in the history of the human race (Blackley referred to articles in Business Week from that time and how you could almost replace the name Atari with Facebook to produce modern articles word-for-word). To give context, in 1982 the music industry was worth $4 billion and the movie industry was $3 billion. Pac-Man itself eventually became an industry on the scale of the entire movie business at the time.

Nowadays, people often compare these primitive games with fully-featured gaming environments like Modern Warfare, but forget that today’s games are being launched into a very mature and games-aware audience. Also, the games of the 80s weren’t just being played by a niche of gamers, but rather by a universal demographic of people. For those amazed by the wide-ranging demographics of those now playing games on mobiles and tablets, this really is nothing new. There are other smaller similarities: the achievement badges with high-quality designs and artwork from arcade games like Asteroids or Gravitar are very similar to those given out in XBLA, PSN or iOS games today. The games trade shows are just as silly as they were back in the 80s when they were invented. And there’s even some cosplay!

What we are facing now is not a brand new situation that no one has ever seen before: there has been no sudden horrible change in the demographics of the world that is causing consumers to behave in some insane way as they take up gaming. We again have a culture that gives permission to play games just like it was 1977. You can be enthusiastic, you have permission to be a gamer, and companies are again talking to a whole audience of people that they haven’t been able to talk to in nearly 20 years. It is interesting to see the corner being turned again, but there is a pattern in human endeavour that has dogged us since we started keeping records.

A new idea is introduced and sees initial success. People get accustomed to it, but then we lose the context for that idea, it declines, and it takes a long time to build back to where you were (there are numerous examples of this from TV or movies to computers). Games also had that effect in the 80s: players with high scores became virtual heroes appearing on talk shows, and there were TV shows consistently at the top of the ratings with kids just playing video games and audiences cheering them on. People got sick of it, and games went away to become more of a hobby interest, with the marketing of games being targeted towards this hobby audience.

Now, with games re-emerging from their hobby audience demographic back into the mainstream, the danger returns. The need for novelty in games begets the demand for a range of games catering to different tastes, which in turn leads to exploitation and overproduction, with the inevitable crash. Unfortunately, the video games business did an excellent job of crashing itself in the 80s. As an example, apparently more cartridges were produced for the game E.T. the Extra-Terrestrial than there were Atari 2600s to play them on (many are apparently buried under concrete somewhere in New Mexico). Everyone knew it was crazy, but games were so extremely popular that they felt they had to do something like that. Blackley refers to this Atari internal memo from Innovative Leisure colleague Rich Adam, where he bemoans the impact of what he terms “License Fever” on the quality of video games. If you start to feel that you need to exploit a business because of its scale, you are beginning to disrespect the customer and will crash yourself.

The way that people purchase and play games has changed radically in recent years. Much has been made in a variety of media articles about the death of consoles, about social media taking over the world or the death of social media, and so on. Facebook has changed the way that we think about talking to customers online; iOS has changed the way we think about marketplaces and digital downloads; Amazon has changed the way we think about hosting our content and data. The world is changing, but we can still try to engender that feeling of specialness in getting a game for the first time: the feeling a teenager has when they drive all the way to a store and spend $16 on a plastic disc because they love the medium so much. Blackley advised us not to squander that, to remember how much we love games and to recall that moment when you first saw a game that was really special, that changed your life. He wants game producers to focus their efforts on recreating that and passing that moment on to the audience. A love of gameplay, a spirit of innovation: these are the things that make the video games industry a really good business.

Just as I was writing this article, Seamus Blackley coincidentally wandered by and we had an interesting chat about the origins of his name (while working at Looking Glass Studios as Jonathan Blackley, his colleagues gave him a new name – Seamus – that he adopted informally at first and later formally through a name change). He asked me to mention in the article that he was a mean bastard, but actually he’s an inspiring guy. Thanks Seamus!

Dublin Hosting Important Seminar on Data Protection

It is essential for any business operating in the modern world to know the data protection laws and how it is affected by them, even if in some special circumstances it would find itself exempt. If the confidence that customers and clients place in how a company treats their data is not enough motivation to raise standards, then the advent of severe penalties in the coming years should certainly make business owners and operators pay more attention.

According to Lisa Jackson, ICT Solicitor at Leman Solicitors in Dublin (and a former colleague of ours from a previous incarnation of Technology Voice):

“New laws come in the period surrounding 2015 and if you are found not to be in compliance with them then the penalty for that could be 2% of your global turnover.”

In addition to these rather steep fines there is the possibility of intensely negative PR that could ruin a person’s business reputation for many years into the future, if not for life. Lisa says that, depending “on the nature of your breach, you could end up in the papers. Certainly, if the Data Protection Commissioner (DPC) catches wind that you are not in compliance then he can come down and investigate you.”

Data controllers who repeatedly or seriously run afoul of the relevant data protection laws can find themselves featured in case studies on the DPC website, with the possibility of having their names mentioned again in the annual reports. The DPC is aware that a process of education and awareness needs to take place, so there is no formal naming-and-shaming policy.

However, being known as a person who knowingly flouts data protection legislation would have a hugely negative outcome for a number of reasons:

  • Newspaper journalists who check these posts regularly may publish stories which can only reflect badly on a company.
  • It opens a path for litigation from disgruntled users.
  • It may put a real cap on growing the business as future investors will be reluctant to be involved with transgressors of a law that deals with such a sensitive issue.

In January 2012, the European Commission proposed a comprehensive reform of the data protection legislation. This was largely a response to the massive technical progress and innovation there has been since the creation of the original Data Protection Directive of 1995.

These directives define essential principles concerning data protection that allow law-makers in the EU to formulate and enact the appropriate legislation. For data to be secure there first should be clarity about what constitutes secure data and clear protocols for the handling of private and personal information.

Lisa points out that, “Without legislation and without any sanctions to penalize companies or guidelines to set out the best practice for companies to follow when protecting personal data they hold belonging to someone else then there would be no effective security at all and this would have a huge impact on personal privacy.

“The law just sets out the ground rules. It is the basis on which everything else is built.”

Although the regulations of individual countries in Europe are not closely harmonized (each of the member states has been allowed to implement the directives a little differently), things are very different for a non-member nation.

“Europe is the most secure place for data,” Lisa says. “It is the place where the rules are most stringent. When transferring data to and from America, Americans need to actually step up to our laws.”

“If they don’t adhere to those rules then they are not allowed to transfer data.”

While some may view yet more legislation as burdensome, there is a tremendous upside to complying with both present and future data protection regulations.

“You are proving that you have a certain amount of security around your data and that you put a certain amount of thought into your systems.”

In July 2012, Lisa was involved with a seminar that was ostensibly concerned with software licensing. However, according to Lisa, in the ensuing Q&A session, four out of every five questions were about data protection.

“Pretty much every one in the room had a question about data protection. I think there is a lot of misinformation out there and people don’t know how to comply.”

To fulfill the clear demand for more information about data protection, Lisa has organized a special event which she will be chairing, “LOCK UP YOUR DATA – Data Protection Panel Discussion”. It is free and takes place on Friday, 14 September 2012, 2:30pm – 4:30pm.

One of the speakers will be the Data Protection Commissioner, Billy Hawkes.

Apart from simply discussing aspects of the relevant legislation at hand Lisa also hopes that the event will be seen as an important and positive response to the seemingly all-pervasive talk in the media about leaks and breaches where security hasn’t met the standards that it should have.

“Why not have an event that was positively focused and gave useful information on how to protect data for small to medium companies so they can learn to comply with their obligations rather than just hearing about all the negative press surrounding data security breaches and failures to uphold data protection obligations.

“This is also an event where you can informally ask the Data Commissioner the questions that concern you.”

But should you find yourself in the unfortunate position of being unable to attend, Lisa makes this particularly pertinent suggestion for companies that are starting out and have not yet registered with the DPC.

“The important thing to do when building your company from the ground up is to start thinking about your obligations at that point. Not to leave it two years down the line, when you haven’t registered and you are now going cap in hand to the Data Protection Commissioner to excuse yourself for not having registered for the previous years when it is so easy to do and it is so inexpensive.”

Useful Links:

Ireland – Data Protection Commission

UK – Information Commissioner’s Office

US – Safe Harbor Framework: a self-certifying process organized by the US Department of Commerce to enable companies to comply with EU data protection regulations.

LOCK UP YOUR DATA – Data Protection Panel Discussion Free: Friday, 14 September 2012, 2:30pm – 4:30pm

Ireland Ranked 10th in Study of Web’s Use, Utility and Impact [INFOGRAPHIC]

Ireland was ranked 10th out of 61 countries analysed in the World Wide Web Foundation’s “Web Index” published today. The Web Index is a ranking of the political, economic and social impact of the Web on people in developed or developing nations. In the European region, Ireland was ranked 6th, ahead of France and Germany. At Technology Voice, we have produced some infographics showing Ireland’s place in this index.

This first picture shows the main contributors towards Ireland’s high ranking. Economic impact was highest. Our political impact and communications readiness received lower scores, but political impact had a higher weighting.

Ireland’s place in this scatter plot shows how we may need to improve communications readiness for a higher proportion of web users.

Ireland was ranked top in terms of economic impact of the Web. This score assessed the utility of the Web and its impact on business and the economy. Our ICT service exports as a percentage of GDP were a strong factor in this, followed by good web use for agriculture.

This matrix chart shows the various top-level categories and sub-categories evaluated and Ireland’s place therein. Click here for the full version.

Finally, we compare Ireland (shown in gold) with Sweden (at the top of the index, purple), the Russian Federation (in the middle of the index, pink) and Yemen (at the bottom, green).

Infographics generated using IBM Many Eyes and these datasets.

Can We Have One Raspberry Pi Per Child Please? [REVIEW]

Although it is just slightly over a year since the Raspberry Pi low-cost single-board computer made its public debut (when an alpha version of the board was shown running the Quake 3 game and playing full HD video), five thousand news articles and 30 million Google results later, tens of thousands of Raspberry Pis have been shipped around the globe. It’s not alone: the MK802 (which we will be reviewing shortly), CuBox and Cotton Candy are part of a wave of mini computers that are putting low-cost computing into family living rooms and student dorms.

The Arduino single-board microcontroller has found widespread application over the past few years across a range of domains from smart clothes to interactive play. Similarly, the more powerful Raspberry Pi is being used for ideas such as a photo souvenir printer and a touchscreen for seniors and late adopters. At Technology Voice, we’ve been testing out Model B of the Raspberry Pi, received this week from Allied Electronics. At $35, you get a basic computing system that just requires a USB keyboard/mouse, HDMI screen, power lead and an SD card pre-loaded with the operating system before you’re up and running.

We started by downloading the Raspbian operating system (a version of the popular Debian Linux OS, optimised for the RasPi) and flashing it onto a standard SD card, then plugged in a Mac keyboard and mouse, a full-size HDMI lead, ethernet cable, and a micro-USB cable for powering the board from our TV (you can also use a DC adaptor if you have it). Our TV complained a little about the USB power requirements but we soon saw a familiar Linux boot-up screen. The first thing you are shown is a configuration menu to do things like expand the operating system’s ‘root’ partition to the full size of your SD card, change the keyboard type (it had my aluminium Apple one listed), set the locale and time zone, etc. You can choose to boot straight into desktop mode after startup to bypass any login prompt.
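For anyone repeating this at home, the flashing step can be sketched as a couple of shell commands. The image filename and the /dev/sdX device name below are placeholders rather than the exact ones we used: double-check the device (with `lsblk` on Linux or `diskutil list` on a Mac) before running anything, because dd overwrites whatever disk it is pointed at.

```shell
#!/bin/sh
# Sketch of writing a Raspbian image to an SD card with dd.
# IMAGE and DEVICE are placeholders -- substitute your own values.
IMAGE="raspbian.img"
DEVICE="/dev/sdX"   # e.g. /dev/sdb on Linux; verify with lsblk first!

# Build the command and print it rather than running it, so the sketch
# is safe to execute as-is; run it by hand once the device is confirmed.
CMD="sudo dd if=$IMAGE of=$DEVICE bs=4M && sync"
echo "$CMD"
```

Once the card is written, the Pi boots from it directly; there is no separate installer step.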

Raspbian comes with a lightweight desktop featuring a small selection of apps: a browser, Unix terminal, Scratch interface (a programming system geared towards kids and educational use), and a Python program editor with some Python-based games. For those who have tried one of the other mini computers like the MK802, this may seem pretty barebones, but there are some other more fully-featured Linux distributions for the RasPi. There is also some good news in that a port of Android Ice Cream Sandwich is being made available as an alternative operating system for the RasPi, bringing with it easy access to a range of Android apps produced for mobiles and tablets.

We tried out the default web browser Midori and it speedily loaded the Google home page; our more graphics-laden Technology Voice page was somewhat slower. To test out how the RasPi performed under load, we fired up a range of applications including the browser, a Python game, a terminal, file manager and some system settings. The system RAM was quickly used up and the CPU maxed out at times, but the RasPi was still pretty usable.
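For the curious, the kind of numbers we were eyeballing can be pulled from /proc on any Linux system, the RasPi included. This is a generic sketch, not a RasPi-specific tool:

```shell
#!/bin/sh
# Minimal sketch of checking memory and CPU pressure on a Linux system
# such as the RasPi, using only /proc -- no extra packages needed.

# Total and free memory, in kB.
grep -E '^(MemTotal|MemFree):' /proc/meminfo

# Load averages over the last 1, 5 and 15 minutes.
cut -d' ' -f1-3 /proc/loadavg
```

Watching these while opening applications gives a rough sense of when the 256 MB of RAM is exhausted and the system starts to labour.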

There have been a bunch of ideas on the best uses for the Raspberry Pi, but a personal proposal is to port the One Laptop Per Child (OLPC) idea here as a “One Raspberry Pi Per Child”. Having spent the afternoon with my six-year-old playing around with Scratch for the first time, I am convinced there are great opportunities for the RasPi as an educational tool, either at home or in a school setting. Apparently a country in the Middle East feels the same way, proposing to give a Raspberry Pi to every schoolgirl in the country. As well as Scratch, there are a range of applications that could be made available or ported to the RasPi, including Chris Ball et al.’s WikiBrowse for the OLPC (in turn based on code released by Patrick Collison), Kojo and ToonTalk.

Based on population figures from the last census, we can estimate that there are perhaps 555,000 children aged between 6 and 14 in Ireland. It would cost around 15 million euro to buy a personal Raspberry Pi Model B for every one of our children. It’s a large sum, but not an impossible one. Not everyone has an HDMI-ready TV or keyboard/mouse to hand, but these could be shared in schools and homes. So how about it? ORasPiPC anyone?
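That 15 million euro figure is easy to check. The child population and the $35 unit price are the article’s figures; the dollar-to-euro rate of roughly 0.78 is our own assumption for illustration (it is in the right neighbourhood for mid-2012):

```shell
#!/bin/sh
# Back-of-the-envelope cost of one Raspberry Pi Model B per child.
CHILDREN=555000       # estimated 6-14 year olds in Ireland (from census)
PRICE_USD=35          # Model B list price

TOTAL_USD=$((CHILDREN * PRICE_USD))
# Convert at an assumed mid-2012 rate of ~0.78 EUR per USD, using
# integer maths (multiply by 78, divide by 100).
TOTAL_EUR=$((TOTAL_USD * 78 / 100))

echo "Total: \$$TOTAL_USD (~EUR $TOTAL_EUR)"
```

That comes to about $19.4 million, or roughly 15.2 million euro, in line with the estimate above.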