Inside Data 66

by Graham Keitch

The Internet of Things will need new tools and technologies for handling data, as Graham Keitch explains.

HardCopy Issue: 66 | Published: June 1, 2015

Some 30 years ago, many large organisations were creating new IT departments to oversee the introduction of desktop computers. These were widely known as Information Centres, but in reality they were more concerned with automating standalone office processes. The concept of putting information at the hub of things was nevertheless a worthy objective, and since then data has become central to nearly every office system. In the business world most of it is transactional, and the idea of deriving information from it is a more recent trend made possible by Business Intelligence tools.

Our systems are increasingly making use of contextual data to inform, influence or support a transaction or process, but not necessarily as a mandatory component. For example, the short-term forecast for a seaside resort isn't a required value for a train ticket transaction, but the availability of such data is useful to both the vendor and consumer. The past decade has brought us technologies that have enabled both websites and applications to display more of this type of information. However, the Internet of Things (IoT) will take this to a whole new level, massively increasing the amount of available information as sensors distributed throughout a business's relevant environment feed it a constant stream of data. Some of this will be used for traditional transactional purposes, but a great deal more will be used for decision support, with the potential to benefit both the business and its customers.
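The distinction between mandatory and contextual data can be made concrete with a minimal Python sketch. The field names and the weather lookup here are invented for illustration; the point is that contextual data is attached on a best-effort basis, and its absence never blocks the transaction itself:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TicketTransaction:
    # Mandatory transactional fields
    origin: str
    destination: str
    price: float
    # Optional contextual data: useful if present, never required
    forecast: Optional[str] = None

def enrich(txn: TicketTransaction, weather_lookup) -> TicketTransaction:
    """Attach a forecast for the destination if one is available."""
    try:
        txn.forecast = weather_lookup(txn.destination)
    except Exception:
        pass  # contextual data is best-effort; the sale stands on its own
    return txn

# A stand-in for a real weather service
ticket = enrich(TicketTransaction("London", "Brighton", 24.50),
                lambda town: {"Brighton": "Sunny, 21C"}.get(town))
print(ticket.forecast)  # Sunny, 21C
```

If the lookup fails or returns nothing, the ticket is still valid; the forecast simply stays empty.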

Embarcadero ER/Studio Data Architect in action.

The potential impact of IoT in terms of data volume is mind-blowing, and participating organisations will need to focus on data that's easy to access in real time and provides maximum benefit for the business and its customers. Each organisation or industry group will need to identify the information that can effectively transform its business, and then work out how to integrate this into its processes and systems. Existing systems probably have this partly covered already, so for many it's more a case of extending what's already in place.

To this end, Microsoft is promoting the idea of creating the Internet of Your Things to help customers identify what is important to their business and build on their existing hardware and software infrastructure. Microsoft offers a framework that allows the data to be marshalled and processed automatically using filters, rules, triggers and other means. The volume and complexity of data that changes hands in a typical IoT conversation is much greater than most conventional systems are geared up to handle, but the flexible and scalable nature of the cloud solves many of the problems associated with handling Big Data. Microsoft Azure Internet of Things provides monitoring and analytical tools that will enable users to evolve their systems to accommodate IoT. The various services are fairly self-explanatory and include Azure Event Hubs, Azure DocumentDB, Azure Stream Analytics, Azure Notification Hubs, Azure Machine Learning and Microsoft Power BI, to name a few.
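Specific services aside, the marshalling idea itself — filters that discard irrelevant events, and rules that trigger actions on those that remain — can be sketched in a few lines of plain Python. The event fields, threshold and alert format below are invented for illustration and are not drawn from any Azure API:

```python
from typing import Iterable

Event = dict  # e.g. {"device": "pump-7", "temp_c": 92.4}

def make_pipeline(filters, triggers):
    """Apply filters to each event, then fire any matching triggers."""
    def process(events: Iterable[Event]):
        alerts = []
        for ev in events:
            if all(f(ev) for f in filters):       # filters: keep or drop
                for rule, action in triggers:     # rules: condition -> action
                    if rule(ev):
                        alerts.append(action(ev))
        return alerts
    return process

pipeline = make_pipeline(
    filters=[lambda e: "temp_c" in e],            # drop malformed readings
    triggers=[(lambda e: e["temp_c"] > 90,        # rule: overheating
               lambda e: f"ALERT {e['device']}: {e['temp_c']}C")],
)

events = [{"device": "pump-7", "temp_c": 92.4},
          {"device": "pump-8", "temp_c": 71.0},
          {"device": "valve-2"}]
print(pipeline(events))  # ['ALERT pump-7: 92.4C']
```

A production system would run this continuously over an ingestion stream rather than a list, but the filter/rule/trigger shape is the same.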

Meanwhile, data giants IBM and Oracle have set their sights on the manufacturing sector, which will be a big adopter of IoT. Industrial processes will have sensors within the machinery that creates things, and within the things it creates. IBM has pledged some $3 billion over four years for an IoT business unit based on an open platform cloud solution.

Industrial automation is also of great interest to Oracle on account of Java's presence across a broad spectrum of devices. Many of Oracle's systems are built on the Java Enterprise Edition (J2EE) platform, and many are specifically tailored for industry verticals such as the health, financial and manufacturing sectors.

Database technology is evolving to accommodate the diverse and complex nature of Big Data. In-memory solutions now provide lightning-fast access to frequently used parts of the database, while column stores and improved search and retrieval methods are becoming more commonplace to help with the analytical side of things. The expansion and multiplicity of database technologies brings its own set of problems. Embarcadero has focused on providing tools that help with this, such as ER/Studio, which addresses the requirements of data architects who need to work with a wide range of database platforms, including some of the newer types associated with Big Data and IoT such as Hadoop and MongoDB.
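The analytical appeal of a column store is easy to see in miniature. In this illustrative Python sketch (the sensor fields are invented), the same readings are held row-wise and column-wise; an aggregate over one field only needs to touch one array in the columnar layout, whereas the row layout forces a scan through every record:

```python
# Row store: each record kept whole; an aggregate must visit every record.
rows = [
    {"sensor": "s1", "temp": 21.0, "humidity": 40},
    {"sensor": "s2", "temp": 23.5, "humidity": 38},
    {"sensor": "s3", "temp": 22.0, "humidity": 45},
]

# Column store: one array per field; an aggregate reads only what it needs.
columns = {
    "sensor":   ["s1", "s2", "s3"],
    "temp":     [21.0, 23.5, 22.0],
    "humidity": [40, 38, 45],
}

# Average temperature touches only the "temp" array.
avg_temp = round(sum(columns["temp"]) / len(columns["temp"]), 2)
print(avg_temp)  # 22.17
```

On disk the difference is far more dramatic: a columnar engine can compress and skip whole columns, which is why the layout suits the analytical workloads IoT generates.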

Oracle is also helping to bridge the platforms with several toolsets released recently for its Big Data Appliance. These are largely aimed at Hadoop, which uses the MapReduce programming model, a model unfamiliar to traditional RDBMS professionals. It is only a matter of time before such technologies become more widely available and commonplace as IoT takes hold.
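For readers coming from an RDBMS background, the MapReduce model is simpler than its reputation suggests: map emits key/value pairs, a shuffle groups them by key, and reduce collapses each group to a result. This toy Python sketch (with invented sensor data) does in one process what Hadoop distributes across a cluster:

```python
from collections import defaultdict

# Input: (device, temperature) pairs, as might stream in from IoT sensors
readings = [("pump-7", 92.4), ("pump-8", 71.0),
            ("pump-7", 88.1), ("pump-8", 69.5)]

# Map: emit intermediate key/value pairs (identity here; real jobs
# typically parse raw records into keys and values at this stage)
mapped = [(device, temp) for device, temp in readings]

# Shuffle: group all values sharing a key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: collapse each group to a single result (here, the mean)
result = {key: round(sum(vals) / len(vals), 2)
          for key, vals in groups.items()}
print(result)  # {'pump-7': 90.25, 'pump-8': 70.25}
```

The SQL-minded equivalent is a GROUP BY with an aggregate; the difference is that map and reduce are arbitrary code, which is what lets Hadoop chew through unstructured Big Data.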