Year in Review: Our Top Five Most Popular Articles

It is not our usual way at Technology Voice to predict winners and losers in the uses and exploitation of technology and innovation. We see our task as being to find out what is new, relevant and interesting and to tell others about it. About 40% of ‘these others’ are readers who come from the business world, matched by a roughly equal number who come from the world of academia, particularly from research. Many of our readers are from Ireland, but a significant number also come from overseas.

However, from our position as observers on the sidelines we can’t help but notice trends and patterns in both the marketplace and research laboratories. These trends are reflected almost isomorphically in the relative popularity of our articles.

As is customary at the end of the year, we round up with a list of our top five most popular articles. Their relevance to our readers acts as a commentary on, and a reflection of, activity in the tech world over the last year.

It is clear from this selection that networking technologies of all sorts are of predominant interest.

The most useful metric for determining relevance that we have at our disposal is the popularity of a given article or blog, measured in direct hits on that article’s page. But we should offer a caution here: we know, though we cannot quantify it with any accuracy, that our articles circulate out across the digital landscape in ways that cannot easily be counted or assessed. So, the following list is based on the first order of popularity that we were able to quantify ourselves.

(Finding a way to accurately track the dissemination of URLs would bring the inventor untold wealth from the world at large and eternal gratitude from me.)



TapMap: Navigating Offline Store Inventory With Online Technology

This is the business article that drew the most interest this year. So, congratulations to Philip McNamara.

TapMap matches a request for a product made through a mobile device with the product’s availability from a given supplier. This saves the customer from having to traipse around from place to place, or even site to site.
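
To make the idea concrete, here is a minimal sketch of matching a product request to nearby stock. The store names, coordinates and data structures are invented for illustration; this is not TapMap’s actual data model or API:

```python
# Hypothetical sketch of matching a product query to nearby stock.
# Store names, locations and inventories are invented; this is not
# TapMap's actual data model or API.
from math import hypot

stores = [
    {"name": "City Electronics", "location": (0.0, 0.0),
     "stock": {"usb-cable": 14, "hdmi-cable": 3}},
    {"name": "Suburb Superstore", "location": (5.0, 2.0),
     "stock": {"usb-cable": 0, "hdmi-cable": 40}},
]

def find_nearest_stockist(product, user_location):
    """Return the closest store that has the product in stock, or None."""
    candidates = [s for s in stores if s["stock"].get(product, 0) > 0]
    if not candidates:
        return None
    return min(candidates,
               key=lambda s: hypot(s["location"][0] - user_location[0],
                                   s["location"][1] - user_location[1]))

print(find_nearest_stockist("hdmi-cable", (1.0, 1.0))["name"])  # City Electronics
```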

TapMap also helps smaller retailers fight against the default shopping mode that a lot of us have, which is the tendency to just go to a larger merchandiser on the assumption that ‘they will just have it.’



Starfish: A User-Controlled Network

With the massive coverage of Wikileaks and the activities of Anonymous, this was an article that seemed to capture the current zeitgeist. Moving to a decentralized method of distributing information using currently available technologies offers us an opportunity to slip the shackles of Big Brother and the Telcos.

Of course, our communications technology, both hardware and software, comes from somewhere, and that somewhere has to be remunerated in some way. But the flatter, more egalitarian distribution system put forward here has a lot to offer in terms of efficiency and robustness.

It seems almost inevitable that this sort of networking will be implemented in some fashion.



Data Mining: Using Predictive Analysis And Social Network Analysis

Although Herculean, the task of moving and storing petabytes of data across the planet, and occasionally beyond, has been managed with relatively little intrusion into our daily lives. The internet could double in size over the very short term and those of us who aren’t equipment manufacturers would barely notice.

But the value of data lies in its relevance and usefulness to a given purpose. It has to have meaning to someone or something. Discerning the meaning of data and its significance to other bits of data is the work of Eric Robson, who leads the Data Mining and Social Networks Analysis Group at the TSSG, which is based at the Waterford Institute of Technology in Ireland.

Without doubt, the TSSG is one of Ireland’s gems, and with its focus on the commercialization of research we will be hearing much more from, and about, them over the coming years.

Data mining is relevant in every area of life where it is important to match seemingly disparate bits of information, not only to tell us what is going on but also to give us some ability to predict what may happen over a future period of time. Applications range across the whole of supply management and the distribution of services, and extend to areas such as law enforcement.



Bio-Inspired: Electronic Chips Emulate Workings Of Neuron

Our second most popular story of 2011 came from, of all places, just across the NUI Galway campus from our office.

Dr. Fearghal Morgan, Dr. Jim Harkin and Dr. Liam McDaid have used the natural architecture of the brain to create an electronic system that emulates some of the workings of a neuron.

I am particularly pleased about this one as I have had a long-standing interest in the work of Jeff Hawkins, formerly of Palm and Handspring, and his development of software architecture and processes that parallel the working of the brain at Numenta.

While ‘brain as a computer’ is a limited metaphor, there is no question that there are processes in the various parts of the cortex that lend themselves very well to emulation on a microprocessor.

Our brains are the most complex data handlers that we know of but the ability to utilise technology that has been worked on for hundreds of millions of years offers us a wonderful opportunity to find new, better and more efficient ways of doing things.
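
To give a flavour of what emulating the workings of a neuron can mean in practice, here is a minimal software sketch of a leaky integrate-and-fire neuron, one of the simplest textbook neuron models. It is a generic model, not the specific architecture developed by the researchers above, and all parameter values are illustrative:

```python
# Minimal leaky integrate-and-fire neuron: a generic textbook model,
# not the specific hardware architecture described above.
# All parameter values are illustrative.

def simulate(input_current, threshold=1.0, leak=0.9):
    """Integrate input over time, leak charge each step, spike at threshold."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:              # fire, then reset
            spikes.append(t)
            potential = 0.0
    return spikes

# A steady weak input produces regularly spaced spikes.
print(simulate([0.3] * 20))  # [3, 7, 11, 15, 19]
```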



CrowdGather

This was by far our most popular post ever, and it is indicative of the difference between the hyped web and the ‘real web.’

Facebook deserves to be written about both as a cultural phenomenon and as a technology that constantly evolves through trial and error (whatever happened to Deals and email?). But it also provides us with a great opportunity to discover how users participate and communicate with each other by digital means. (Let’s leave the walled garden argument to one side for the moment. With nearly 800 million users, it is now moot whether it is a barrier in the cultural sense rather than the technical sense.)

However, the web is expanding faster than Facebook is. As a thought experiment: if you could somehow stand at the edge of the expanding web and look at the space that Facebook occupies, the difference in growth rates would make that space appear, in relative terms, to be shrinking.

StumbleUpon has amazing growth and activity figures which parallel Facebook’s, but it has not attracted the massive press coverage that Facebook has. Most people, even those not on the web, could tell you that Mark Zuckerberg is the main man at Facebook. Try a pop quiz with friends and relatives over the holidays and see how many could name his counterpart at StumbleUpon.

But this focus on Facebook is distorted and is not an accurate reflection of what is really happening in the field of online communications.

Bigger than either of these in terms of activity and sharing is the world of online forums. Our interview with Sanjay Sabnani illustrates some key points; here are a number of quotes from the article:

“What matters on a forum is the worth of your intellect, the merit of your thoughts and your ability to communicate them.”

“Forums are designed for a multiplicity of people to communicate with a multiplicity of people and they are done in an organized fashion with a taxonomy that makes sense.”

“What forums allow you to do is the sum total of everything you can do on the internet.”

We would like to take this opportunity to thank our regular contributors: Conor Harrington, Lisa Jackson, Ina O’Murchu and John Conroy, for the help they have given us this year.

We would also like to thank Tom McEnery, Rich Moran and Aoife Connelly for their valuable contributions, which we are very grateful to have received.

Vital++: Television Content through the Internet

Internet Protocol Television (IPTV) is a mechanism by which televisual content is delivered to your television set by using the internet. Instead of receiving programs via an aerial or through cable you will be able to have them delivered through your broadband connection.

The conventional delivery of programming via an aerial or through a cable requires the material to be sent more or less as a continuous stream from the transmission center to the television screen. But not all content uses the available bandwidth equally.

An action/adventure feature film, for instance, can carry considerably more audio and visual data at a given point in its timeline than, say, a studio discussion between two talking heads. Providers have to allow for these high-density transmission rates by increasing the designated bandwidth, but as a consequence, when less audio/visual content is present, unused capacity is created in what the telecom operators call the reserved space.

The most effective way to make more efficient use of bandwidth is peer-to-peer (P2P) file sharing. Instead of everything having to come down one pipe, so to speak, data is stored across a network of PCs, and the P2P software locates the nearest copy of the content rather than continually fetching it from a centralized server.
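
A minimal sketch of that peer-selection idea, with invented peer names and latency as a stand-in for ‘nearest’:

```python
# Toy illustration of P2P content lookup: pick the "nearest" peer
# (lowest latency here) that holds the requested chunk, falling back
# to a central server only if no peer has it. Names and numbers invented.

peers = [
    {"id": "peer-a", "latency_ms": 12, "chunks": {"movie.part1", "movie.part2"}},
    {"id": "peer-b", "latency_ms": 45, "chunks": {"movie.part2"}},
    {"id": "peer-c", "latency_ms": 80, "chunks": {"movie.part3"}},
]

def locate(chunk):
    holders = [p for p in peers if chunk in p["chunks"]]
    if holders:
        return min(holders, key=lambda p: p["latency_ms"])["id"]
    return "central-server"  # fallback when no peer holds the chunk

for chunk in ("movie.part2", "movie.part4"):
    print(chunk, "->", locate(chunk))
```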

However, as it presently stands, a lot of P2P file sharing is highly problematic for the network operators. A significant proportion of file-sharing activity is in the trafficking of unlicensed and unpaid-for copyrighted material.

A solution to this issue of unregulated transfer of data is Vital++, a project with multiple partners across Europe. These range from the Fraunhofer Institute in Germany, where the MP3 format was invented, through to Telefonica, Telecom Austria and partners in Greece who are experts in P2P networks.

The Irish effort in this project emanates from the TSSG who are based at the Waterford Institute of Technology. Shane Dempsey is one of the project members.

“If you were broadcasting a popular football game using the Vital++ system you would still have to reserve some bandwidth to make sure the overall performance of the system was adequate, and that the viewers who were watching the peer-to-peer distributed TV were getting a good experience.

“What we are trying to do is minimize the amount of expensive quality of service reservations. We use pretty advanced algorithms for distributing the content based on statistical usage properties.

“With the Internet, it’s generally best effort. That’s not good enough for high definition TV but it’s free, or very cheap. So what we are trying to do with our peer-to-peer network is to use as much best effort as possible and then as a last resort reserve quality of service.

“We had to address the mechanisms that were available in the network to reserve bandwidth as required by the peer-to-peer network. So, we had to come up with a way to do that. We had to find a way to monitor a very large number of users who could be part of the content distribution overlay. An overlay is basically a group of users who are involved in distributing content to other users and receiving it themselves.”
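
The ‘best effort first, reserve as a last resort’ strategy Shane describes can be sketched roughly as follows. The capacity figure and demand estimates are invented; Vital++’s real algorithms are statistical and far more sophisticated:

```python
# Rough sketch of "use best effort first, reserve QoS as a last resort".
# The capacity figure and demand estimates are invented; Vital++'s real
# algorithms are statistical and far more sophisticated.

BEST_EFFORT_CAPACITY = 100  # Mbit/s the overlay can typically deliver for free

def plan_delivery(predicted_demand_mbps):
    """Split a stream between free best-effort capacity and paid reservation."""
    best_effort = min(predicted_demand_mbps, BEST_EFFORT_CAPACITY)
    reserved = max(0, predicted_demand_mbps - BEST_EFFORT_CAPACITY)
    return {"best_effort": best_effort, "reserved": reserved}

# A quiet documentary vs. a popular football match:
print(plan_delivery(60))   # fits entirely in best effort
print(plan_delivery(250))  # 100 best effort + 150 reserved
```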

In contrast to a centralized content service where the user just pays a subscription fee, being able to monitor what is going on throughout the content overlay, and throughout the group of users participating in the service, makes micro-charging possible.

This makes it much easier for pay-as-you-go and pay-to-play services to be made available.

Another area where the TSSG provides its expertise is in contributing additional functionality to the Vital++ project to allow the building of licensing and content-management infrastructure on top of the reserved spaces. This enables the appropriate usage and licensing of fair-use content for educational and other purposes.

The standardization process is still going on at the International Telecommunication Union. In two or three years we should start seeing some of these mechanisms being used in IPTV systems. The next stage after that would be to have IPTV sets in our homes.

FeedHenry: Building Apps in the Cloud

FeedHenry is a service that allows developers to build an app from a single source code base, have it processed in the cloud, and receive an output that is configured for the major mobile phone operating system of their choice – iPhone, Android, BlackBerry, Windows Phone 7 or Nokia Web Runtime.

Applications are built using JavaScript, which runs on FeedHenry’s open-standards technology platform. Developers can construct their apps online through a browser, or they can download a software development kit if they wish to work on their own computers.

Micheál Ó Foghlú is the CTO of FeedHenry and explains further, “It is an entirely hosted environment. The development platform is hosted, as well as the server-side cloud where the apps are deployed.

“On that website you edit HTML, JavaScript and CSS, and you press a button and it spits out a binary APK file, for instance, that you can just upload on to your Android phone. Or it spits out a binary file that you can upload to the Apple App Store to put on your iPhone via iTunes.”

Having a single source code base means that developers and clients can work in a single cross-platform development environment and avoid having to write and update their app multiple times for multiple platforms.
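
The appeal of a single source base is easy to see in miniature. The sketch below illustrates the ‘one code base, many build targets’ pattern only; it is not FeedHenry’s actual tooling or API:

```python
# Illustration of the "one code base, many targets" pattern only;
# not FeedHenry's actual build pipeline or API.

APP_SOURCE = {"index.html": "...", "app.js": "...", "style.css": "..."}

PACKAGERS = {
    "android": lambda src: "app.apk",   # stand-ins for real per-platform packagers
    "ios": lambda src: "app.ipa",
    "blackberry": lambda src: "app.cod",
}

def build_all(source, targets):
    """Produce one artifact per platform from the same source tree."""
    return {t: PACKAGERS[t](source) for t in targets}

print(build_all(APP_SOURCE, ["android", "ios"]))
# {'android': 'app.apk', 'ios': 'app.ipa'}
```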

Micheál cites an example of what can happen in the normal development process, “Usually the first thing a client may say is, ‘We want an app.’ Then they say, ‘We can hire these software developers to build us an app.’ Then they say, ‘Hey guys, we’ll give you 5k, 10k, 20k. Can you build us an iPhone app?’

“The guys build them an app. Then they say, ‘Android have shipped more units in the States than iPhones did in 2010. Maybe we should have an Android app as well.’

“Typically, they go back to the same developers and say, ‘Can you build us an Android app?’ And they say, ‘No, we’re all Objective-C guys. We know how to build iPhone apps. We don’t know how to build Android apps.’

“Then you have to pay another bunch of guys 5k, 10k, 20k to build the Android phone app and so on for Blackberry, Windows 7, Nokia Web Runtime.”

But the story doesn’t end there, “Then you have to update the app. So you have to pay the first guys to update the iPhone app, pay the second guys to update the Android app and so on across the handset operating systems.

“It becomes a nightmare in terms of code bases for what in terms of logical business functionality is a single functionality.”

The server side element which is hosted in the cloud allows easy back end integration into enterprise services. Sophisticated apps can be created using standard web technologies and be integrated into existing business and IT applications without any additional investment.

The three main market segments the FeedHenry team are looking at are enterprise solutions, telecoms operators and independent developers.

Its application delivery platform can deliver the same application interface to both smartphones and social media sites.

The original work on FeedHenry was done by the Telecommunications Software and Systems Group (TSSG) in Waterford, Ireland back in 2008, and it now operates as a pay-as-you-go service with six full-time staff.

According to Micheál, “The purpose of TSSG is to try and be excellent at basic research, applied research and commercialization, and to have a balance of all three. What FeedHenry represents is one of our best attempts at having a real commercialization impact — to have a commercial spin-out of the technology.

“It was a strong enough technology at the right time in the marketplace to make it worth getting investment and spinning out to make a play in the marketplace.”

SOCIETIES: Combining Pervasive Computing with Social Networks

SOCIETIES is a project to bring together social computing and pervasive computing into one overall framework that can be deployed to allow third party developers to provide next generation services beyond what is possible today. Pervasive computing is about making technology disappear into the background so that users can remain unaware that technology is acting on their behalf.

Pervasive computing uses information derived from the array of sensors and devices that make up the context of our digital lives. Context could be your location at given times of the day, the number of cars on the road or the weather. Any information that can be digitally discerned from our actions and interactions with our environment and that can be turned into data provides the context in which pervasive computing can work.

This context information can be combined with an individual user’s personal preferences for how they want technology to act on their behalf and how they, in turn, interact with technology. That enables the system to make proactive decisions about the use of these services, giving the user a more personal and relevant experience.
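
In outline, combining context with preferences amounts to a rule lookup keyed on the current situation. A toy sketch, with invented rules and context fields rather than anything from the SOCIETIES framework itself:

```python
# Toy sketch of a preference engine acting on context. The rules and
# context fields are invented; this is not the SOCIETIES framework itself.

preferences = [
    # (condition on the current context, action to take on the user's behalf)
    (lambda c: c["location"] == "office" and c["time"] < 9, "silence phone"),
    (lambda c: c["weather"] == "rain" and c["driving"],     "suggest slower route"),
]

def act_on(context):
    """Return every action whose condition matches the current context."""
    return [action for condition, action in preferences if condition(context)]

print(act_on({"location": "office", "time": 8, "weather": "rain", "driving": False}))
# ['silence phone']
```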

The SOCIETIES project launched in October 2010 and has sixteen partners: eight academic and eight industrial. It is funded under Europe’s Seventh Framework Programme (FP7). SOCIETIES is the largest integrated project from the fifth call for project submissions for FP7, and it is the only one coordinated by an Irish academic partner, in this case the Telecommunications Software & Systems Group (TSSG), based at the Waterford Institute of Technology.

Kevin Doolin, Chief Engineer and Chair of the Scientific and Technical Board at TSSG, explains further, “We are merging pervasive computing with the whole area of social networking and social computing.

“The key thing in SOCIETIES is that we would be providing services that are context aware on behalf of an entire community of users rather than the individual users that have been catered for up until now.

“You have these smart phones, smart cars and smart offices but all these entities work in isolation. There is no real interoperability between the different smart environments that are out there.

“What we are doing in SOCIETIES is building a framework and bringing these smart-spaces together so they can interoperate.

“You can have your own smart-space which would be you and all the devices that you own connected together. You could walk into a smart-office and your smart-space could actually connect into that smart-office environment, for example. And you can get access to various services within that office. It could also be in a supermarket or at the side of the street or anything else.

“With SOCIETIES we have taken that quite a step forward. We are dealing with communities of users and providing services for multiple users at the same time. There are many issues there. First of all, ‘How do you find the users that would form a community?’

“We could do that by social networks. We can mine data out of social networks. For example, if you go to a conference and you have an interest in pervasive computing and you’re subscribed to the SOCIETIES system, we could then find out how many other people at that conference have similar interests to yourself. We could then join them to you digitally and share whatever you want to share: data, experiences, business cards, and everything would be done dynamically.”
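
The conference example boils down to comparing interest profiles. Here is a minimal sketch using Jaccard similarity on interest sets; the attendees and the threshold are invented:

```python
# Find conference attendees with overlapping interests using Jaccard
# similarity. Attendees and threshold are invented for illustration.

attendees = {
    "you":   {"pervasive computing", "p2p", "iptv"},
    "alice": {"pervasive computing", "sensors"},
    "bob":   {"cooking", "travel"},
}

def jaccard(a, b):
    """Overlap between two sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def suggest_community(me, threshold=0.2):
    mine = attendees[me]
    return [name for name, interests in attendees.items()
            if name != me and jaccard(mine, interests) >= threshold]

print(suggest_community("you"))  # ['alice']
```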

It is not such a big step to go from the idea of technology being pervasive to the idea of it being intrusive. Privacy and security are notions that are still highly valued by many people despite claims that the Age of Privacy may be over.

As Kevin says, “Security, trust and privacy are critical issues to deal with. Everything I have said so far sounds like Big Brother, monitoring the users, following them everywhere and knowing their every move. But we can only do that if the user is happy for us to monitor them like that. We have a lot of research that has been done into users’ privacy requirements, security requirements and trust requirements.

“Using a social network as a context source isn’t something that is done at the moment. Combining that with personal preferences for a group of users is something that is very complicated to do. On top of that we have what we call a work package that is dealing with the personalization of services and taking proactive actions on behalf of the user. Then on behalf of the user within a community of users.

“Part of the challenge there is to actually learn about the users, their behaviours and how they interact with the technology and the services that are available.

“The integration of multiple different device types. Everything from your phone to your laptops, your digital photo frames, your fridge could potentially be integrated into this framework we are going to develop. So trying to develop an abstraction layer that will allow all these different devices to communicate and operate together is another one of the challenges we are going to face.”

Bringing together pervasive computing, the handling of data derived from sensors in the environment, and social networking technology is a daunting technical challenge. To accomplish this goal, technologies will have to be created and developed that don’t exist yet. The future will have to be invented.

Data Mining: Using Predictive Analysis and Social Network Analysis

Data mining is the extraction of information from raw data. It describes the attempt to find hidden patterns within the data and determine what they might mean. Eric Robson leads the Data Mining and Social Networks Analysis Group at the TSSG, which is based at the Waterford Institute of Technology in Ireland. It is a small group mainly concentrated on the commercialization of research.

They look at data in two main ways: predictive analysis and social network analysis. The general approach in predictive analysis is the classification and grouping of users, customers or subscribers.

These tools tend to be used by large organizations such as supermarket chains and telecoms operators. They are expensive in terms of software and hardware, and expensive in terms of the people needed to operate these systems.

Eric explains further, “For instance, a large supermarket has many thousands of customers and many thousands of products to sell. Usually each customer is tracked via their charge card or their club card and we are able to see return visits.

“On day one, a customer might buy bread and some butter. On day three, they buy some more bread but it might not be until day fourteen that they need to buy some more butter. From this simple example we can see how a trend or a purchasing pattern can be determined.”
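
The simplest version of the pattern detection in Eric’s example is just measuring the gaps between repeat purchases. A toy sketch, with transaction data invented to mirror the bread-and-butter example:

```python
# Estimate a customer's repurchase cycle per product from the gaps
# between visits. Transactions invented to mirror the example above.

purchases = [(1, "bread"), (1, "butter"), (3, "bread"),
             (5, "bread"), (14, "butter")]   # (day, product)

def repurchase_cycles(transactions):
    days_by_product = {}
    for day, product in transactions:
        days_by_product.setdefault(product, []).append(day)
    cycles = {}
    for product, days in days_by_product.items():
        gaps = [b - a for a, b in zip(days, days[1:])]
        if gaps:
            cycles[product] = sum(gaps) / len(gaps)  # mean gap in days
    return cycles

print(repurchase_cycles(purchases))  # {'bread': 2.0, 'butter': 13.0}
```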

In telecoms, another area with a great many users and frequent, variable engagement with the services or products provided, customers are profiled in terms of their usage.

“Once we have a profile we can see when things go wrong or are not working as they should be. It can help us with fault detection or fraud detection.

“One of the major security risks is SIM-cloning. Where someone can get hold of your SIM card, clone it, and then make calls using your account. Suddenly on your bill you see a whole lot of calls going out to countries you never actually called.”

Knowing what constitutes a normal pattern of behaviour for a given customer allows the system to alert its administrators of unusual or anomalous activity.
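
That profiling-and-alerting loop can be reduced to a baseline-and-deviation check. A toy sketch; the call records and the threshold are invented stand-ins for the statistical profiles a real system would build:

```python
# Flag calls that fall outside a customer's usual calling profile.
# Call records and the simple rule are invented; real systems build
# statistical profiles rather than plain set membership.

history = ["IE", "IE", "UK", "IE", "UK"]     # usual destination countries
new_calls = ["IE", "ZZ", "ZZ", "UK", "ZZ"]   # today's calls ("ZZ" = unseen)

profile = set(history)
anomalies = [c for c in new_calls if c not in profile]

if len(anomalies) / len(new_calls) > 0.4:    # invented alert threshold
    print("Alert: unusual calling pattern:", anomalies)
```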

In a previous article we gave a brief overview of social network analysis, describing it as a way of measuring how we are connected. The Data Mining Group at the TSSG have an interest in making this technology more useful to the general business community.

“In social network analytics people are constantly passing messages to each other. From a marketing perspective we can look at who we should be targeting to send our viral message out to for further [propagation.] Who are the biggest distributors of content? It may not necessarily be commercial entities. It could be bloggers, people with very active Facebook accounts, people with very active Twitter accounts.

“In terms of product, we can start identifying who are the key influencers. Say, I wanted to sell something like running shoes and this guy is a marathon runner and blogs about them. If we know that people listen to him then the running shoe manufacturer can start targeting this guy. ‘Here’s a free pair of running shoes. Tell us what you think of them.’ More importantly, ‘Tell the world what you think of them.’

“If someone is blogging about something we want to understand exactly what they are blogging about and what their opinion was on that subject. Did they like it or not like it? To what extent did they like it or not like it?”
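
Identifying ‘the biggest distributors of content’ is, at its simplest, a centrality calculation on the sharing graph. A minimal sketch using out-degree as the influence measure; the network is invented, and production systems use richer measures such as PageRank:

```python
# Rank users by how many people they pass content to (out-degree).
# The network is invented; production systems use richer centrality
# measures such as PageRank rather than raw degree.

shares = [  # (sender, receiver) pairs: who forwarded content to whom
    ("runner_blog", "alice"), ("runner_blog", "bob"),
    ("runner_blog", "carol"), ("alice", "bob"), ("dave", "alice"),
]

out_degree = {}
for sender, _receiver in shares:
    out_degree[sender] = out_degree.get(sender, 0) + 1

for user, score in sorted(out_degree.items(), key=lambda kv: -kv[1]):
    print(user, score)   # runner_blog 3, alice 1, dave 1
```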

Eric says, “We decided to see if we could make it applicable to the SME (small and medium-sized enterprise) market. We took these techniques and put them into our cloud-based system.

“We will host the infrastructure and host the knowledge and techniques that people need and we will put it up as a pay as you go service.”

As yet, this service is not live, but the Data Mining and Social Networks Analysis Group are still able to bring their knowledge and expertise to the marketplace. In an arrangement called an Innovation Partnership, set up in conjunction with Enterprise Ireland, they are working with a Dublin-based company called Datafusion International, which writes software for law enforcement agencies such as the Gardaí (the Irish police service) and Homeland Security in the US.

Each of these agencies has a number of data sources. They are able to access data from such things as the land registry to determine the owner of a property or the vehicle registry to determine ownership of a car, van or some other vehicle. Also, they have access to revenue records and court transcripts.

These are all discrete sources of data available to the law enforcement agencies. The only problem is that they are all separately housed in their own departments, such as the departments of justice, the departments of transport, the ports authorities and so on, with no linkup between them.

Eric explains, “What they said to us was, ‘We have all this data. We would like to try and link people together. We would like to see a social network map of everybody in the system.’

“So, we took all that data and started discerning relationships between them. If two people had the same address we would put a link between them. If they came in on the same flight we would also be able to indicate that there was a link between them.
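
The linking step Eric describes, same address or same flight, is a record-linkage pass over shared attributes. A toy sketch with invented records:

```python
# Link people who share an attribute value (address, flight, ...).
# Records are invented; real linkage also handles fuzzy matches.
from itertools import combinations

records = [
    {"name": "A", "address": "1 Main St", "flight": "EI123"},
    {"name": "B", "address": "1 Main St", "flight": "EI999"},
    {"name": "C", "address": "7 Quay Rd", "flight": "EI123"},
]

def links(people, keys=("address", "flight")):
    """Return an edge for every pair of people sharing a key value."""
    edges = []
    for p, q in combinations(people, 2):
        shared = [k for k in keys if p[k] == q[k]]
        if shared:
            edges.append((p["name"], q["name"], shared))
    return edges

print(links(records))  # [('A', 'B', ['address']), ('A', 'C', ['flight'])]
```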

“We are using a product called GATE from the University of Sheffield. It is a term-extraction engine. We can look at any kind of news article or any piece of free text and it will parse that text. It will tokenize it and break it up into different parts of speech, in terms of what’s a noun and what’s a verb. But more importantly it will identify the names mentioned in the article and who they are mentioned in relation with.”

“We don’t use linked data technology as yet but we do use fuzzy logic. The software is designed to be used by trained individuals within the various law enforcement agencies. Although the program can identify different persons, or the same person in different places, there will be a human presence involved in the process of checking and verifying identities.”

IMS: How Telecoms is Becoming More Like the Internet

Next Generation Network (NGN) technology is a term that refers to the transition from the traditional technical organisation of telecoms services to one based on IP, the Internet Protocol. The TSSG, who are based at the Waterford Institute of Technology in Ireland, have been researching and prototyping the technology that is going into the telecoms architecture of the immediate future. This new type of telecoms structure is referred to as IMS, the IP Multimedia Subsystem.

The older system, Signaling System 7 (SS7), was a well-standardized, elaborate, complex set of protocols for building telephony functions such as carrier pre-select services, computer-telephony integration and pre-pay. But the application and service model had some big weaknesses.

As can be seen in the main picture, it was vertically integrated so a developer or innovator was limited to building a specific application on top of a specific subscriber data layer with specific media functions and a specific network interface. As a consequence, there were lots of different protocol variants for each kind of application.

Shane Dempsey is an NGN architect at the TSSG and he explains further, “It meant that the only people who could run a telecommunication service were network operators working with system integrators.

“It was a massive systems integration exercise because you had to know, for example, lots of different variants of the protocol for a particular equipment vendor for this network operator who has a speciality for this equipment, and so on. The question becomes, ‘How do I make this work?’ This led to really expensive development life-cycles.

“IMS is not child’s play but it is a lot less hassle. Because it’s a lot less hassle there are API layers being built on top of it.”

IMS has a horizontal model for its architecture as opposed to a vertical one. This allows for a common database for subscriber data and common media function capabilities. Telecoms architecture starts to look a lot more like internet architecture.

One of the many reasons for shifting to this new telecoms structure was the realization that the success of web-based applications on the internet, particularly those based on social networks, implied that there were similar opportunities to be exploited in the area of mobile technology.

Shane points out that, “Previously, telecoms vendors didn’t believe that they needed additional ways of storing information like the contacts that you have, your directory of friends, the presence that you have or your dynamic information like your location. It didn’t really occur to them that you needed that.”

Creating that functionality in SS7 was difficult because of the inherent complexities. However, the move to IMS is not necessarily straightforward either.

When you move to a mobile internet it becomes necessary to move to a packet based network. Once you are doing that you might as well have IP switching in the core.

Shane goes on to say, “IP in the core network isn’t a huge deal because the internet is IP at the core. But pushing IP out into the network is a big deal because previously it was based on time-slot technologies. If you are making calls, voice is time-slot orientated. [By means of Time Division Multiplexing — TDM.] So moving to IP is a major effort in terms of standardization.

“Packet switching is a kind of a colloquialism that internet scientists use. The packets aren’t of a fixed size, but the data can be divided up into packets, and you don’t necessarily get the same throughput every second. So you can get voice traffic coming through plus internet traffic, where people are sharing all sorts of files, documents, audio, video, etc., which are being sent over the same connection.”
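
A toy illustration of that contrast: variable-size packets from separate streams interleaving over one shared link, rather than each stream owning a fixed time slot. All sizes are invented:

```python
# Toy packet multiplexing: variable-size packets from separate streams
# interleave over one shared link, unlike fixed TDM time slots.
from itertools import zip_longest

voice = [("voice", 160), ("voice", 160), ("voice", 160)]  # bytes per packet
data  = [("file", 1500), ("file", 900)]

# Interleave the two streams onto the shared link, skipping gaps.
link = [p for pair in zip_longest(voice, data) for p in pair if p]
for stream, size in link:
    print(f"{stream}: {size} bytes")
```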

With foundation layers that are common to all parts of the system, a great many applications, some of which exist in some form today, become easier to build and easier to deploy.

For businesses, for example, it will be easier to have:

  • Corporate Directory: You can have your own business contacts on your mobile phone, but it will now be possible to access your company’s own directory if it is active.
  • CRM: Applications will be easier to build. It will be much easier to see who is on or off the grid and where they are.
  • Communication Log: For corporate audit services.

For more general use, the Rich Communication Suite offers functions on our mobile handsets that we are familiar with from the internet, such as:

  • Calls enriched with multimedia sharing.
  • Video calling and conferencing.
  • High-definition quality voice calls.
  • Enhanced messaging.
  • Mobile and desktop convergence: All the operators are making web service APIs available for the IMS platforms. This will allow third-party developers to build applications that can set up conference calls, pull presence information, pull location information and so on.

As Shane states, “We’ll effectively be using internet communications everywhere.”

Shane has a slideset that you may view for further information.

Zolk C: Using Mobile Pervasive Services to Enhance User Experience

Zolk C is a company that provides interpretative guides, by means of handheld devices, for exhibitions, museums and tour sites. It can be used wherever there is a need to enhance a visitor’s experience of a given venue. Zolk C was spun out from the Telecommunications Software and Systems Group (TSSG) in 2008, and through an ongoing innovation partnership the TSSG is driving the Zolk C technology.

John McGovern is a researcher in the area of mobile pervasive services and is Head of Technical Projects at Zolk C.

Pervasive services are seamlessly available anywhere, at any time and in any format. Pervasiveness is defined or utilized under a number of themes:

  • Location
  • Context: Which can be defined in three ways:
    • Where the user is
    • Who the user is with
    • What resources are available to the user
  • Sensors
  • Self-learning: Context definition and context interaction based on the ability to self-learn.

Up until recently, if you went to a museum or a tourist site, you might be provided with some sort of device that could only give you audio to help guide you around the location. What a user would expect from an interpretative tour running on a mobile pervasive service is far richer and far more extensive than simple audio.

The National Trust for Scotland wanted to revamp the visitor experience at one of its major sites of national importance: the place where the Battle of Culloden took place. They wanted to mark out locations on the battlefield that were of special interest. However, for reasons of sensitivity and aesthetics, they didn’t want to clutter up the site with placards and signs.

Here is a video of the technological solution to this brief that Zolk C were able to provide:

John explains further, “Context is one of the key drivers behind pervasive services. Every action and interaction that the user has on the device and with the device is monitored and logged and is being fed into the engine. So we can use this to profile and model what users are doing and what users aren’t doing.

“In that engine as well, we have built a positioning algorithm that allows us to do fine-grained positioning indoors. We are able to take multiple sensors, augment the location information that we get from them, and provide a more accurate pinpoint of where you are.

“What we are then able to do for Zolk C is enable them to layer the content and rich media, image files for example, on top of that positional information. They can then provide a bespoke interface for their client which, coupled with our location engine, is a really powerful tool.

“From that we can predict things. If a user has gone through a museum and has spent the morning looking at the armoury section, and as a consequence missed something else in the exhibit, we can raise an alert and say something like, ‘Did you know there was another armoury section behind door B?’, for instance. We are able to tailor the experience to individuals. This is real data in real-time that would be relevant to the tour provider.”
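
The missed-exhibit alert John describes is essentially a set difference between the exhibits on offer and the device’s visit log. A toy sketch with invented data:

```python
# Alert a visitor to sections they have not yet visited, based on the
# device's interaction log. Exhibit names and the log are invented.

exhibits = {"armoury A": "door A", "armoury B": "door B", "textiles": "hall 2"}
visited = {"armoury A", "textiles"}

for exhibit, location in exhibits.items():
    if exhibit not in visited:
        print(f"Did you know there is another section ({exhibit}) behind {location}?")
```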

A WiFi framework has been added, which gives the benefit of real-time communications. Previously, to upgrade a device it would need to be plugged into a PC and synced. Using WiFi, all the devices can be upgraded simultaneously in about twenty minutes, provided they are all switched on and working.

The ability to communicate leads to the possibility of networks forming, and from networks, communities can form. John explains, “A big thing that is coming down the road is the ability for tour operators or exhibitors to add communities, and by allowing users to think they are part of a community it really increases the traction to the website.

“If you were at Culloden, say, and you took pictures of your family, you would be able to load them into the Culloden community site and then share those pictures with other communities that you may be part of, such as friends or co-workers. We have been able to allow them to do that quite easily.

“We can do device-to-device communications and device-to-server communications as well. For example, if you were to spot a deer on the lawn on your tour, you could broadcast out to other devices, ‘Come look, there’s a deer on the lawn.’”

Mobile pervasive services making use of information derived from context – where the user is, who the user is with and what resources are available to the user – will become a tradable commodity for service providers going forward.

As John points out, “To be able to take the relevant data in terms of context, and provide targeted advertising based on that context directly to the users, will definitely be worth a lot of money.”