Enterprise IoT: Why Securing IoT Devices Needs to Be the Number One Priority

The number of IoT devices around the world keeps on growing. Globally, there are now more than 26 billion connected devices, according to research from Statista – up from 15 billion in 2015 – with the number projected to rise to over 75 billion by 2025. In 2018, the global IoT market stood at about $164 billion, and is expected to increase almost tenfold over the next six years, reaching around $1.6 trillion by 2025. The popularity of IoT technology is drastically transforming how society functions and how businesses are run. Be it manufacturing, transportation, telecoms, logistics, retail, insurance, finance or healthcare, the vast proliferation of IoT technology is on course to disrupt practically every industry on the planet. However, as more and more IoT devices are deployed across the enterprise, new challenges emerge for developers – and securing IoT systems is chief among them. 

(Image source: statista.com)

IoT in the Enterprise

Although much media attention surrounding IoT has focused on consumer products – smart speakers, thermostats, lights, door locks, fridges, etc. – some of the most exciting IoT innovations are coming from the business sector. The combination of sensor data and sophisticated analytical algorithms is allowing companies in a broad range of industries to streamline operations, increase productivity, develop leading-edge products, and solve age-old business problems. Consider the performance of all types of equipment and machinery – from jet engines to HVAC systems – being constantly monitored with sensors to predict the point of failure and avoid downtime automatically. Or how about driver speed behavior information being shared in real-time with an insurer – or geolocation beacons pushing targeted advertisements and marketing messages to customers when they are in or near a store. Usage of data from IoT sensors and controllers for better decision making – combined with automation for better efficiencies – is enormously valuable. As such, more and more businesses are getting on board with the IoT revolution.  

84% of the 700+ executives from a range of sectors interviewed for a Forbes Insights survey last year said that their IoT networks had grown over the previous three years. What’s more, 60% said that their organizations were expanding or transforming with new lines of business thanks to IoT initiatives, and 36% were considering potential new business directions. 63% were already delivering new or updated services directly to customers using the Internet of Things.   

By industry, nearly six in ten (58%) executives in the financial services sector reported having well-developed IoT initiatives, as did 55% of those in healthcare, 53% in communications, 51% in manufacturing, and 51% in retail. 

(Image source: info.forbes.com)

The survey also showed that leveraging IoT as part of a business transformation strategy increases profitability. 75% of leading enterprises credited IoT with delivering increased revenue. 45% reported that the Internet of Things had helped boost profits by up to 5% over the previous year, another 41% said that it had boosted profits by 5% to 15%, and 14% had experienced a profit boost of more than 15% – and all anticipated IoT to have a significant profit-boosting impact in the year ahead. 

(Image source: info.forbes.com)

However, key to profitability and business success with IoT technology is security. Indeed, along with developing/maintaining appropriate algorithms/software and speed of rollout, securing IoT was cited by the executives as one of the top three IoT challenges. How do organizations ensure the integrity of their IoT data? How do they ensure that the various operational systems being automated with the technology are controlled as intended? These are questions that need to be answered, for a lot of hard IoT security lessons have been learned in recent years. 

Securing IoT in the Enterprise – An Ongoing Challenge 

As the number of connected IoT devices in the enterprise increases, new threats emerge. Distributed Denial of Service (DDoS) attacks provide a number of high-profile examples. Here, vulnerable connected devices are hijacked by hackers and used to send repeated and frequent queries that bombard a target, such as a Domain Name System (DNS) provider, causing it to crash. For instance, the Mirai botnet in 2016 knocked major internet services in North America and Europe offline by taking over hundreds of thousands of IoT devices – mainly IP security cameras, network video recorders and digital video recorders – and using them in a DDoS attack.

Mirai was able to take advantage of these insecure IoT devices in a simple but clever way – by scanning big blocks of the internet for open Telnet ports, then attempting to log in using 61 username/password combinations that are frequently used as the default for these devices and never changed. In this way, it was able to amass an army of compromised CCTV cameras and routers to launch the attack. Perhaps most concerning of all, however, is that the Mirai botnet source code still exists “in the wild”, meaning that anyone can use it to attempt to launch a DDoS attack against any business with IoT implementations – and many cybercriminals have done just that. 
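Defending against this kind of attack starts with auditing your own device inventory for factory defaults. The sketch below is a minimal, hypothetical audit helper; the credential list and device records are purely illustrative, not Mirai's actual dictionary:

```python
# Hypothetical audit helper: flag devices still configured with
# factory-default logins of the kind Mirai exploited. The credential
# list below is illustrative, not Mirai's actual dictionary.
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("root", "root"),
    ("root", "12345"),
    ("admin", "password"),
    ("user", "user"),
}

def find_default_logins(devices):
    """Return the devices whose configured login matches a known default."""
    return [
        d for d in devices
        if (d["username"], d["password"]) in DEFAULT_CREDENTIALS
    ]

inventory = [
    {"host": "cam-01.lan", "username": "admin", "password": "admin"},
    {"host": "dvr-02.lan", "username": "ops", "password": "x7#kQ!92fLp"},
]

at_risk = find_default_logins(inventory)
for device in at_risk:
    print(device["host"], "is using factory-default credentials")
```

In practice, an IT team would run a check like this against its device management database as part of onboarding, and force a credential change before any device is allowed on the network.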

Another example involves a US university in 2017, which suddenly found over 5,000 on-campus IoT devices – including vending machines and light bulbs – making hundreds of DNS queries every 15 minutes to sub-domains related to seafood. The botnet spread across the network and launched a DDoS attack, resulting in slow or completely inaccessible connectivity across the campus. Again, it was weak default passwords that left these devices vulnerable. 

One of the main problems with IoT devices being used in workplace environments is that many are not inherently secure. Part of the issue is that there are literally thousands of individual IoT manufacturing companies – many of which started life in the consumer market – with very little consistency between them. What this means is that each IoT device that ends up in the workplace – be it a lightbulb, vending machine, or CCTV camera – will likely have its own operating system. Each will likely have its own security setup as well – which will be different from every other connected thing in the office – and a different online dashboard from which it is operated. Many of these devices are also shipped with default usernames and passwords, making them inherently hackable. The manufacturers, meanwhile, take little or no responsibility if any of these devices are hacked, meaning the onus for securing IoT in all its forms falls entirely upon an organization’s IT department – and too often no one is assigned to this critical task. 

What makes it so critical? Well, thanks to Shodan – a specialized search engine that lets users find information about IoT devices (including computers, cameras, printers, routers and servers) – anyone, including hackers, can locate devices that use default logins with a simple web search. However, what’s good for hackers can be seen as being good for enterprises, too. Though the very existence of Shodan is perhaps scary, IT professionals should be using the search engine proactively as a security tool to find out if any information about devices on the company’s network is publicly accessible. After that, securing IoT is down to them. 

(Image source: shodan.io)

Another issue that renders securing IoT devices absolutely essential is the threat of spy tech and ransomware. Many IoT devices incorporate microphones, cameras, and the means to record their location, leaving organizations vulnerable to sensitive data being stolen or company secrets being exposed and held to ransom. Things like IoT-enabled building management systems can also be left open to surveillance or meddling from malicious third parties. A hacker could, for instance, lock all the doors in an office building or cut all the power. As an example, researchers at Def Con demonstrated how such a system can be targeted with ransomware by gaining full remote control of a connected thermostat. In a real-life scenario, such an attack could result in an office becoming uninhabitable, opening an organization up to ransom demands to regain control. 

In short, with the ever-increasing number of IoT devices an organization relies upon, the attack surface grows in kind – as does the unpredictability with regards to how hackers may seek to exploit them.

The Huge Costs of Not Securing IoT 

Securing IoT should be a top priority for practically all businesses for the simple reason that practically all businesses are invested in IoT. In fact, according to recent research from DigiCert – State of IoT Security Survey 2018 – 92% of organizations report that IoT will be important to their business by 2020. The executives interviewed cited increased operational efficiency, improving the customer experience, growing revenue, and achieving business agility as the top four goals of their IoT investments. 

(Image source: digicert.com)

However, securing IoT remains the biggest concern for 82% of these organizations. And it’s no wonder – a full 100% of bottom-tier enterprises (i.e. enterprises that are having the most problems with IoT security issues) had experienced at least one IoT security incident in 2018. Of these, 25% reported related losses of at least $34 million over the previous two years. 

(Image source: digicert.com)

These bottom-tier companies are much more likely to experience data breaches, malware/ransomware attacks, unauthorized access/control of IoT devices, and IoT-based DDoS attacks than top-tier companies (i.e. companies that are best prepared in terms of IoT security). So – what are top-tier companies doing differently? Well, DigiCert found that they all had five key behaviors in common – they were all ensuring device data integrity (authentication), implementing scalable security, securing over-the-air updates, utilizing software-based key storage, and encrypting all sensitive data. 
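The first of those behaviors, authenticating device data, can be as simple in principle as attaching a message authentication code to every reading so the server can detect tampering. Here is a minimal sketch using Python's standard hmac module; the shared key is made up, and real deployments would provision unique per-device keys or certificates:

```python
import hashlib
import hmac
import json

# Hypothetical shared secret provisioned to one device. Real deployments
# would use unique per-device keys or certificate-based authentication.
DEVICE_KEY = b"example-device-key-not-for-production"

def sign_reading(reading: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the server can verify integrity."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": reading, "tag": tag}

def verify_reading(message: dict) -> bool:
    """Recompute the tag server-side; reject the message if it differs."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

message = sign_reading({"sensor": "hvac-07", "temp_c": 21.5})
assert verify_reading(message)       # untouched message verifies

message["payload"]["temp_c"] = 99.0  # simulate in-transit tampering
assert not verify_reading(message)   # tampered message is rejected
```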

Speaking to Security Now, Mike Nelson, Vice President of IoT Security at DigiCert, comments on the findings: “The security challenges presented by IoT are similar to the many IT and internet security challenges industries have faced for years. Encryption of data in transit, authentication of connections, ensuring the integrity of data – these challenges are not new. However, in the IoT ecosystem these challenges require new and unique ways of thinking to make sure the way you’re solving those challenges works. Regarding evolution of security challenges, the biggest challenge is simply the scale and the magnitude of growth. Having scalable solutions is going to be critical.”

(Image source: digicert.com)

Final Thoughts

IoT has the potential to open up many new opportunities for growth and agility within the enterprise. However, securing IoT devices remains absolutely crucial. Organizations need to take the necessary steps to ensure that their devices and data are adequately protected from end to end. This will involve conducting a thorough review of the current IoT environment, evaluating the risks, and prioritizing the primary security concerns that need to be addressed. Strong, unique passwords must be mandatory for every device. Firmware must be kept constantly updated, and only secure web, mobile and cloud applications with strong encryption and data protection features should be used. All data must be encrypted – both at rest and in transit – with end-to-end encryption made a product requirement for all connecting devices. It's also important that this data is stored and processed securely after it has been transmitted across the network. Device updates must be monitored and managed around the clock, all year round. Finally, the security framework and architecture must be scalable to support IoT deployments both now and in the future. As such, working with third parties that have the resources and expertise to manage scaling IoT security programs will be invaluable. 

Vinnter serves as an enabler for developing new business and service strategies for traditional industries, as well as fresh start-ups. We help companies stay competitive through embedded software development, communications and connectivity, hardware design, cloud services and secure IoT platforms. Our skilled and experienced teams of developers, engineers and business consultants will help you redefine your organization for the digital age, creating new, highly-secure connected products and digital services that meet the evolving demands of your customers. Get in touch to find out more. 

Making sense of IoT connectivity protocols

IoT, short for the Internet of Things, refers to a connected system of devices, machines, objects, animals or people able to communicate autonomously across a common network without the need for human-to-human or human-to-computer interaction. This relatively recent innovation is already revolutionizing many sectors with its ability to add connected intelligence to almost everything, including smart homes, smart automobiles, smart factories, smart buildings, smart cities, smart power grids, smart healthcare, smart agriculture and smart livestock farming, to name just a few.  

IoT is still a nascent innovation, but it has an evolutionary trail that leads back, as per the ITU (International Telecommunication Union), to the early years of the last century. 

A brief history of telemetry, M2M and IoT

A 2016 Intro to Internet of Things presentation from the ITU charts the legacy of the modern IoT revolution back to 1912, when an electric utility in Chicago developed a telemetry system to monitor electrical loads in the power grid using the city’s telephone lines. The next big milestone, wireless telemetry using radio transmissions rather than landline infrastructure, was passed in 1930 and used to monitor weather conditions from balloons. Then came aerospace telemetry with the launch of Sputnik in 1957, an event widely considered the precursor to today’s modern satellite communications era.     

At this point M2M as we know it was still some years away, awaiting two landmark breakthroughs, nearly three decades apart, to propel it into the mainstream. 

The first breakthrough came in 1968, when Greek-American inventor and businessman Theodore G. Paraskevakos came up with the idea of combining telephony and computing – the theoretical foundation of modern M2M technologies – while working on a caller line ID system. The second happened in 1995, with Siemens launching the M1, a GSM data module that allowed machines to communicate over wireless networks. From then on, regular improvements in wireless connectivity, and the Federal Communications Commission's advocacy for the use of spectrum-efficient digital networks over analog networks, paved the way for more widespread adoption of cellular M2M technologies.  

IoT is the most recent mutation in this extended evolutionary chain of autonomous machine-to-machine connectivity. However, though both approaches share the same foundational principles, there are some marked differences as shown in the chart below. 

SOURCE: https://ipwithease.com/internet-of-things-vs-machine-to-machine-iot-vs-m2m/

Perhaps one of the most significant distinctions between M2M and IoT is in terms of ambition and scope. Current estimates indicate anywhere between 22 and 25 billion connected IoT devices by 2025. But before we have even tapped the potential of networking billions of physical objects, industry aspirations are already shifting toward an Internet of Everything, where not just objects and devices but people, processes, data and things are all connected into one seamless and intelligent ecosystem. 

But whatever the breadth of the ambitions for IoT, the availability of quality connectivity options will eventually determine the value of the outcome. Today there is an overwhelming range of connectivity technologies on offer, with capabilities suited to different IoT applications. 

Classifying IoT connectivity technologies

When it comes to IoT connectivity, technology is constantly changing, with existing options being updated and upgraded and new alternatives continually introduced. And given the diversity of the IoT applications market, available solutions can be classified across a complex matrix of characteristics including range, bandwidth, power consumption, cost, ease of implementation, security, etc. But it is possible to classify these solutions using a simple four-part taxonomy, namely:

  1. Classic connectivity solutions, comprising traditional short-range wireless solutions.
  2. Non-cellular IoT, proprietary technologies deployed by industry players/consortia.
  3. Cellular IoT, standardized technologies that operate in the licensed spectrum.
  4. Satellite IoT, for areas that cannot be covered by any of the above. 
SOURCE: https://www.counterpointresearch.com/lpwans-will-co-exist-no-war-brewing-between-cellular-non-cellular/

Both cellular and non-cellular IoT technologies fall under the broad, and rather self-explanatory, category of low-power wide-area networks or LPWANs. While the former is a standardized technology provided in the licensed spectrum by mobile network operators, the latter refers to private proprietary solutions operating in unlicensed radio frequencies. Both solutions, however, are purpose-designed for IoT and are capable of transmitting small packets of data across long distances, over an extended period, with very limited resource usage. The forecast for LPWAN technologies is that they will cover 100 percent of the global population by 2022. 

1. Classic Connectivity: 

There are a range of technologies that fall under this category, including Wi-Fi, Bluetooth and Bluetooth Low Energy, NFC, RFID, and mesh technologies such as ZigBee, Thread and Z-Wave. As mentioned earlier, these are all short-range solutions that are ideal for bounded environments such as smart homes. If short range seems like a limitation, these solutions make up for it by enabling high-bandwidth transmissions at low power consumption rates. Most of these solutions were not designed specifically for IoT, but as long as the requirement does not include long-distance data transmission, they can still serve as a crucial hub in a larger hybrid IoT environment. 

 2. Non-Cellular IoT: 

There are currently two popular LPWAN solutions, LoRaWAN and Sigfox, in this space.  

  • LoRaWAN is an open IoT protocol for secure, carrier-grade LPWAN connectivity. It is backed by the LoRa Alliance, a global nonprofit association of telecoms, technology blue chips, hardware manufacturers, systems integrators, and sensor and semiconductor majors. 

The protocol wirelessly connects battery-operated ‘things’ to the internet, enabling low-cost, low-power, mobile and secure bi-directional communication. The solution can also scale from a single gateway installation to a global network of devices across IoT, M2M and other large-scale smart applications. Though the LoRaWAN protocol defines the technical implementation, it does not place any restrictions on the type of deployment, giving customers the flexibility to innovate. One of the arguments challenging the technology’s open-standard credentials has focused on implementations being tied to chips from LoRa Alliance member Semtech. However, other suppliers have recently announced an interest in adopting LoRa radio technology. 

SOURCE: https://lora-alliance.org

LoRaWAN already has a massive global footprint, with over 100 network operators having deployed LoRaWAN networks around the world by the end of 2018. The alliance also announced that it had tripled the number of end devices connecting to those networks.    

  • Sigfox was one of the first companies to create a dedicated IoT network that used Ultra Narrow Band modulation in the 200 kHz public band to exchange radio messages over the air. The company’s stated ambition is to mitigate the cost and complexity of IoT adoption by eliminating the need for sensor batteries and reducing the dependence on expensive silicon modules. 

The company’s proprietary protocol is designed for IoT applications that transmit data in infrequent short bursts across long distances, while ensuring low connectivity costs, and reducing energy consumption. It works with several large manufacturers such as STMicroelectronics, Atmel, and Texas Instruments for its endpoint modules in order to ensure the lowest cost for its customers.

The Sigfox network is currently operational in 60 countries, covering an estimated 1 billion people worldwide, connecting 6.2 million devices and transmitting 13 million messages each day. Sigfox has also teamed up with satellite operator Eutelsat to launch a satellite that will enable global coverage.  
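Those infrequent short bursts are genuinely tiny: a Sigfox uplink carries at most 12 bytes of payload per message, so applications bit-pack their readings rather than send verbose formats such as JSON. Below is a sketch of one hypothetical encoding; the field layout and scaling factors are assumptions chosen purely for illustration:

```python
import struct

def encode_uplink(temp_c: float, humidity_pct: float, battery_mv: int) -> bytes:
    """Pack three readings into 6 bytes, well under Sigfox's 12-byte limit."""
    # Temperature scaled to 0.01 degC steps in a signed 16-bit int,
    # humidity to 0.5% steps in one byte, battery millivolts in 16 bits.
    payload = struct.pack(
        ">hBHx",            # big-endian: int16, uint8, uint16, one pad byte
        round(temp_c * 100),
        round(humidity_pct * 2),
        battery_mv,
    )
    assert len(payload) <= 12  # must fit a single Sigfox uplink
    return payload

def decode_uplink(payload: bytes):
    """Reverse the scaling applied on the device."""
    raw_temp, raw_hum, battery_mv = struct.unpack(">hBHx", payload)
    return raw_temp / 100, raw_hum / 2, battery_mv

packed = encode_uplink(21.57, 43.5, 3600)
print(len(packed), decode_uplink(packed))  # 6 bytes; values round-trip
```

The same discipline applies to LoRaWAN and NB-IoT: the smaller the payload, the less airtime and energy each transmission costs.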

There are a few other players, like Link Labs and Weightless SIG, offering their own LPWAN technologies. But LoRaWAN and Sigfox dominate the market, accounting for nearly two-thirds of low-power wide-area network deployments. 

There is, however, a significant challenge emerging from their counterparts in cellular IoT with technologies like NB-IoT and LTE-M. 

3. Cellular/Mobile IoT: 

Proprietary technologies operating in the unlicensed spectrum may seem to have the market cornered, but cellular/mobile IoT is rapidly catching up. Earlier this year, the GSMA announced the availability of mobile low-power wide-area IoT networks in 50 markets around the world, with a total of 114 launches as of May 2019. 

SOURCE: https://www.gsma.com/iot/deployment-map/

These launches include both LTE-M (LTE Cat-M/eMTC) and NarrowBand IoT (NB-IoT/LTE Cat-NB), a set of complementary, IoT-optimized cellular standards developed by the 3GPP (3rd Generation Partnership Project). Both these Mobile IoT networks are ideal for low-cost, low-power, long-range IoT applications and together they are positioned to address the entire spectrum of LPWAN needs across a range of industries and use cases. Operators have a choice of cellular technologies to ensure that they can provide clearly differentiated IoT services based on the market dynamics in their regions. And both these technologies can coexist with 2G, 3G and 4G networks. There are, however, some key distinctions between the two, stemming primarily from the focus on covering as wide a range of IoT applications as possible.  

  • LTE-M (Long Term Evolution for Machines) enables the reuse of existing LTE mobile network infrastructure base while reducing device complexity, lowering power consumption and extending coverage, including better indoor penetration. LTE-M standards are designed to deliver a 10X improvement in battery life and bring down module costs by as much as 50 percent when compared to standard LTE devices. 

One significant development in the LTE-IoT market has been the launch of the MulteFire Alliance, a global consortium that wants to extend the benefits of LTE to the unlicensed spectrum. The group’s MulteFire LTE technology is built on 3GPP standards and will continue to evolve with those standards but operates in the unlicensed or shared spectrum. The objective is to blend the benefits of LTE with ease of deployment. Key features of the latest MulteFire Release 1.1 specifications include optimization for Industrial IoT, support for eMTC-U and NB-IoT-U, and access to new spectrum bands.  

  • NarrowBand IoT or NB-IoT is based on narrowband radio technology and is targeted at low-complexity, low-performance, cost-sensitive applications in the Massive IoT segment. The technology is relatively easy to design and deploy as it is not as complex as traditional cellular modules. In addition, it enhances network capacity and efficiency, supporting a massive number of low-throughput connections over just 200 kHz of spectrum. NB-IoT can also be significantly more economical to deploy than other technologies, as it eliminates the need for gateways by communicating directly with the primary server.

Both these technologies are already 5G-ready. They will continue to evolve to support 5G use cases and coexist with other 3GPP 5G technologies.

The race for 5G deployments has already begun in earnest. Following the launch of 5G services in South Korea and the US earlier this year, 16 more markets are expected to join this as-yet exclusive club in 2019. 

The emergence of 5G, the fifth generation of wireless mobile communications, will no doubt have a major impact on how these services are delivered. These fifth generation networks, with their promise of higher capacity, lower latency and energy/cost savings, have the potential to support more innovative bandwidth-intensive applications and massive machine-type communications (mMTC). 

4. Satellite IoT:

This is ideal for remote areas that are not covered by cellular service. Though that may seem like a niche market, some reports indicate that there may be as many as 1,600 satellites dedicated to IoT applications over the next 5 years. Satellite communications company Iridium has partnered with Amazon Web Services to launch Iridium CloudConnect, the first satellite-powered cloud-based solution for Internet of Things (IoT) applications. 

All of which brings up the question: which IoT protocol is right for you? Every technology discussed here has its USPs and its limitations. Every IoT application has its own requirements in terms of data rate, latency, deployment cost, etc. A protocol that works perfectly well for one use case may prove completely inadequate for another. 

So there is no one-size-fits-all protocol that can be prescribed by application or even by industry. In fact, sticking to just one technology standard doesn’t make sense in many Internet of Things (IoT) implementations – and that’s according to Sigfox itself.

Choosing a cloud service provider in an evolving marketplace

Last year, Gartner axed 14 cloud vendors from its Magic Quadrant for Cloud IaaS, choosing to focus only on global vendors currently offering, or developing, hyperscale integrated IaaS and PaaS offerings. This bit of spring cleaning left behind a more manageable roster of six companies classified into two distinct segments: the Leaders, comprising Amazon Web Services, Microsoft, and Google, and the Niche Players, represented by Alibaba, Oracle, and IBM.

SOURCE: https://www.bmc.com/blogs/gartner-magic-quadrant-cloud-iaas/

Even within this simplified league table of three, Amazon continues to be the clear leader in terms of revenue and market share. 

SOURCE: https://www.parkmycloud.com/blog/aws-vs-azure-vs-google-cloud-market-share/

Now AWS may dominate the cloud market, but Microsoft and Google are growing much faster. According to Q1 2019 figures, the two challengers are growing at 75 and 83 percent respectively, against a comparatively middling 41 percent growth for AWS. AWS’s market share has also remained stagnant at 33 percent between Q1 2018 and Q1 2019.

Additionally, a Gartner scorecard that evaluates public IaaS cloud providers across 263 required, preferred and optional criteria found that Azure has pulled away from AWS in the required criteria for the first time since the scoring started.

SOURCE: https://info.flexerasoftware.com/SLO-WP-State-of-the-Cloud-2019
SOURCE: https://blogs.gartner.com/elias-khnaser/2018/08/01/just-published-new-scorecards-for-aws-azure-gcp-and-oci-cloud-iaas/

Going multicloud and hybrid

But even as the cloud IaaS market coalesces into a “Big Three vs Others” comparison, RightScale’s 2019 State of the Cloud survey revealed that businesses are combining public and private clouds in a hybrid multi-cloud strategy that leverages almost 5 clouds on average.

Another study from Kentik found that the most common cloud combination was AWS and Azure, with the AWS-Google Cloud combination trailing not far behind.

Though public cloud remains the top priority across enterprises, the number of companies deploying a hybrid public plus private cloud strategy is increasing.

At the same time, the number of companies with a multicloud strategy, combining multiple public or private clouds, has decreased. 

All this would suggest that the cloud market is not a zero-sum play. Businesses are using a combination of cloud providers, including the niche players, to design solutions that deliver the best outcomes. And cloud providers will have to take this preference for hybrid clouds into consideration while developing solutions for their customers. 

More importantly, none of this makes it any easier for customers to narrow down the right vendors for their workloads, given the absence of any common framework for assessment. But it is possible to define a template of key considerations that should drive the choice of cloud provider. 

Choosing a cloud service provider

There are several factors that can influence a company’s choice of cloud provider including the platform’s choice of technology and architecture, data security, compliance and governance policies, interoperability, portability and migration support, services development roadmap, etc.

The Cloud Industry Forum, a UK-based not-for-profit organization promoting cloud adoption, has a fairly comprehensive list of eight criteria for selecting the right cloud service provider. At Vinnter, we believe there are four important aspects to picking a cloud provider:

Location proximity: There are two reasons to ensure cloud service providers have actual operations in the customer’s target market. The first is latency, which some have even referred to as the Achilles heel of cloud adoption.

One study on the global network performance of AWS, Google Cloud and Microsoft Azure found that data center location directly affected network latency with network performance varying across different service providers while connecting across different regions.

This lag can have huge performance implications for the many modern business and IoT applications that depend on low latency.    

The second critical factor is data sovereignty. Many countries, including Russia, China, Germany, France, Indonesia and Vietnam, have data residency regulations that require data to be stored in the region. GDPR likewise imposes strict mandates on the collection and processing of EU residents’ data. 

Cloud providers are responding to these mandates of latency and sovereignty by opening up multiple regional data centers. For instance, AWS announced plans to open an infrastructure region in Italy, the company’s sixth in Europe, that would address both low latency and data residency requirements.

In Germany, Microsoft has placed customer data in its data centers under the control of an independent German data trustee, making it difficult for anyone, including Microsoft or US authorities, to access it without the customer’s permission.  
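The two constraints of latency and sovereignty can be combined into a simple region-selection rule: filter candidate regions by residency requirements first, then pick the lowest-latency option among what remains. A toy sketch follows; the region names and latency figures are invented for illustration:

```python
# Hypothetical region catalogue; latency figures (ms) are made up.
REGIONS = [
    {"name": "eu-central", "country": "DE", "latency_ms": 18},
    {"name": "eu-south",   "country": "IT", "latency_ms": 25},
    {"name": "us-east",    "country": "US", "latency_ms": 95},
]

def pick_region(allowed_countries, regions=REGIONS):
    """Honor data-residency rules first, then minimize latency."""
    candidates = [r for r in regions if r["country"] in allowed_countries]
    if not candidates:
        raise ValueError("no region satisfies the residency constraint")
    return min(candidates, key=lambda r: r["latency_ms"])

# A German customer bound by local residency rules:
print(pick_region({"DE"})["name"])        # -> eu-central
# An EU customer allowed to store data in Germany or Italy:
print(pick_region({"DE", "IT"})["name"])  # -> eu-central (lowest latency)
```

The point of the sketch is the ordering: residency is a hard constraint that eliminates regions outright, while latency only ranks the survivors.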

Transparent pricing: Optimizing cloud costs continues to be a top priority, even among advanced cloud users, with one study estimating wastage at about 35 percent of cloud spend. 

The study also identified four reasons for this wastage: complexity of cloud pricing, a better-safe-than-sorry approach leading to overprovisioning, lack of visibility into cost implications, and lack of adequate tools for optimizing spends.

As cloud providers launch new innovations and pricing models, pricing is only expected to get more complicated, with no basis for comparison across services.

In fact, 2018 was predicted to be the year when cloud providers would consolidate and simplify their offerings and pricing structures.  

A transparent pricing model can address almost all of the waste factors mentioned earlier. It provides customers with a common and objective basis for comparing and choosing the service that best suits their workloads, and for optimizing provisioning against actual demand. 
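Even a rough model makes the overprovisioning waste factor concrete. The sketch below compares provisioned capacity against peak utilization to estimate monthly waste; all workload figures and the per-vCPU price are invented for illustration:

```python
# Invented inventory: provisioned vCPUs, peak vCPUs actually used,
# and a flat hypothetical price per vCPU-month.
PRICE_PER_VCPU_MONTH = 30.0

workloads = [
    {"name": "api",       "provisioned_vcpus": 64, "peak_used_vcpus": 20},
    {"name": "batch",     "provisioned_vcpus": 32, "peak_used_vcpus": 30},
    {"name": "analytics", "provisioned_vcpus": 16, "peak_used_vcpus": 4},
]

def monthly_waste(items, price=PRICE_PER_VCPU_MONTH):
    """Cost of capacity paid for but never used, even at peak load."""
    return sum(
        (w["provisioned_vcpus"] - w["peak_used_vcpus"]) * price
        for w in items
    )

total = sum(w["provisioned_vcpus"] for w in workloads) * PRICE_PER_VCPU_MONTH
waste = monthly_waste(workloads)
print(f"spend ${total:.0f}/month, of which ${waste:.0f} "
      f"({waste / total:.0%}) is overprovisioned")
```

Transparent, comparable pricing is what makes this kind of calculation possible in the first place: without a clear per-unit price, neither the total nor the waste can be estimated with any confidence.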

Accessible documentation and support: In an evolving area such as cloud computing, documentation and customer support are critical factors, often the difference between productivity and wastage. Extensive, accessible documentation makes it easier for customers to implement and manage their cloud services optimally. This has to be backed by 24/7 customer service and dedicated account managers to help customers resolve problems and queries.

Both Google Cloud and AWS provide comprehensive documentation as well as community forums where customers can raise implementation or performance issues. All three top vendors offer a basic level of support by default and charge for anything more; Google and AWS, for instance, offer different levels of support at different price points.

Going forward, the documentation and support offered by a cloud service provider could become a key point of differentiation in customers’ choice of cloud platform. 

Finding the right cloud skills: As more and more businesses move their workloads to the cloud, there is growing demand for people with the skills to develop, operate and maintain services deployed in a cloud environment. This will be a critical factor irrespective of the choice of cloud service provider.

But acquiring the right cloud skills has reportedly become a full-blown crisis, with a significant majority of IT managers finding it "somewhat difficult" to find cloud management talent.

In order to deal with this crisis, Deloitte advises businesses to start with an inventory of the cloud computing skills already in the company, across areas such as architecture, security, governance, operations and DevOps, as well as cloud brand-specific skills.

The next step is to define the skills related to these same areas that are required to get the company to where it wants to be in terms of cloud technologies. Finally, train, hire and/or replace talent to build a cloud-first approach to technology. 

Today, it is simply not possible to create a like-for-like comparison template of all major service providers that would be relevant to every business selecting a cloud platform. But there are broader themes that apply across services and that need to be considered to determine the right fit for each company's technology and workload profile, usage pattern, technical maturity and budget.

More importantly, the cloud is not the plug-and-play environment it promised to be. Even post-adoption, every business will need in-house cloud skills to accelerate development and innovation. That may prove to be the biggest challenge of all.

The analytical challenges of IoT data

If data is indeed the new oil, then we are still a long way from mastering the science of extracting, refining and deploying it as a strategic enterprise asset. That, in short, is the conclusion that emerges from two separate studies, and much of it stems from the analytical challenges that come with IoT data.

The first study, from Gartner, classifies 87% of businesses as having low BI and analytics maturity. Organizations within this low-maturity group are further divided into two levels: a basic level characterized by spreadsheet-based analytics, and a higher, opportunistic level where data analytics is deployed, but piecemeal and without central leadership or guidance.

In the second study, from New Vantage Partners, a majority of C-suite executives conceded that they had yet to create either a data culture or a data-driven organization. More worryingly, the proportion of companies that self-identified as data-driven appeared to be declining.

Companies may not yet be data-driven but the data flow shows no signs of slowing down.

According to IDC, the global datasphere will grow to 175 zettabytes (ZB) by 2025, up from 23 ZB in 2017. Even as that happens, the consumer share of this data will drop from 47% in 2017 to 36% in 2025. This means that the bulk of the data surge will be driven by what IDC calls the sensorized world, or the IoT.

SOURCE: https://www.seagate.com/files/www-content/our-story/trends/files/idc-seagate-dataage-whitepaper.pdf

Key challenges of IoT data

The surge of IoT data carries enormous economic value, estimated at around $11 trillion by 2025. But it also brings significant challenges in aggregating data from disparate, distributed sources and applying analytics to extract strategic value.

The primary challenge of IoT data is its real-time nature. By 2025, 30% of all data will be real-time, with IoT accounting for nearly 95% of it; 20% of all data will be critical, and 10% hypercritical. For companies to benefit from these types of data, analytics will have to happen in real time.
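As a rough illustration of what real-time processing entails, the sketch below (plain Python, with made-up sensor values) computes a rolling average over a simulated stream, emitting a result as each reading arrives rather than waiting to batch-process the whole set:

```python
from collections import deque

def sliding_window_average(readings, window_size=5):
    """Emit a rolling average as each reading arrives (stream-style),
    computed over the last `window_size` values."""
    window = deque(maxlen=window_size)   # old values fall off automatically
    for value in readings:
        window.append(value)
        yield sum(window) / len(window)

# Simulated temperature stream from a single sensor (made-up values).
stream = [21.0, 21.5, 22.0, 30.0, 22.5, 21.8]
averages = list(sliding_window_average(stream, window_size=3))
print(averages)  # one smoothed value per incoming reading
```

Production streaming engines apply the same windowing idea at scale, across many sensors and with strict latency budgets.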

Then there is the issue of time series data: any data that carries a timestamp, such as readings from a smart metering service or stock prices. A company's IoT infrastructure must be capable of collecting, storing and analyzing huge volumes of time series data, and the challenge here is that most conventional databases are not equipped to handle this type of data.
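To make the point concrete, here is a minimal sketch (plain Python, with hypothetical smart-meter readings) of the kind of time-bucketed aggregation that purpose-built time series databases perform natively, and that conventional row stores struggle with at IoT scale:

```python
from collections import defaultdict

def downsample(points, bucket_seconds=60):
    """Aggregate (timestamp, value) points into fixed time buckets and
    return the per-bucket average, keyed by bucket start time."""
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % bucket_seconds].append(value)
    return {start: sum(vals) / len(vals)
            for start, vals in sorted(buckets.items())}

# Hypothetical smart-meter readings: (unix timestamp in seconds, kWh).
readings = [(0, 1.0), (30, 2.0), (60, 4.0), (90, 6.0), (120, 3.0)]
result = downsample(readings, bucket_seconds=60)
print(result)  # {0: 1.5, 60: 5.0, 120: 3.0}
```

A time series database does this continuously, over billions of points, alongside retention policies and range queries.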

The distributed nature of IoT data, where most of the data is created outside enterprise data centers, also presents its own set of analytics challenges. Chief among them is the need to process at least some of this data, especially hypercritical and critical data, outside the data center. IoT analytics itself will therefore have to become distributed, with some analytics logic shifting out of the cloud to the edge, spread across devices, edge servers, gateways and central processing environments. In fact, Gartner predicts that half of all large enterprises will integrate edge computing principles into their IoT projects by 2020.
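A toy sketch of this edge-side split, in plain Python (the threshold and field names are illustrative assumptions, not from any standard): readings that cross a critical threshold are forwarded upstream immediately, while routine readings are reduced to a compact local summary for periodic upload:

```python
def triage(readings, critical_threshold=90.0):
    """Split readings at the edge: values at or above the threshold are
    forwarded upstream immediately; the rest are reduced to a summary."""
    forward, local = [], []
    for r in readings:
        (forward if r["value"] >= critical_threshold else local).append(r)
    summary = {
        "count": len(local),
        "avg": sum(r["value"] for r in local) / len(local) if local else None,
    }
    return forward, summary

# Made-up sensor readings; only one crosses the critical threshold.
readings = [{"id": 1, "value": 72.0},
            {"id": 2, "value": 95.5},
            {"id": 3, "value": 68.0}]
critical, summary = triage(readings)
print(critical)  # [{'id': 2, 'value': 95.5}]
print(summary)   # {'count': 2, 'avg': 70.0}
```

The design choice is bandwidth for latency: hypercritical events travel in full and at once, while bulk telemetry is aggregated before it ever leaves the edge.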

These are just a few of the characteristics of IoT data that differentiate it from conventional data sets. And traditional data-analytics technologies and capabilities are not designed to handle the volume, variety and complexity of IoT data. Most companies will have to completely revamp their analytics capabilities to include IoT specific capabilities such as streaming analytics, the ability to identify and prioritize between different data types and formats, and edge analytics. 

CSPs take the lead in IoT data analytics

Many companies are turning to cloud-based IoT platforms that offer rich data services alongside their core IoT offerings. Customers are looking for real-time capabilities across data ingestion, transformation, storage, processing and analysis. Some cloud vendors are even offering their own hardware to improve interoperability and performance between IoT devices and the data processed in the cloud.

According to a Bain & Company study, CSPs (cloud service providers) are seen as leaders in providing a comprehensive set of tools that address all the IoT data analytics needs of the enterprise. These CSPs, according to the same study, are also playing a key role in lowering barriers to IoT adoption, facilitating simpler implementations and enabling customers to design, deploy and scale new use cases as quickly as possible. 

AWS takes the lead among IoT CSPs

Among the big brand CSPs, Amazon AWS has consistently been ranked as the platform of choice, followed by Microsoft Azure and Google Cloud Platform, in the annual IoT Developer Survey conducted by the Eclipse IoT Working Group. 

SOURCE: https://iot.eclipse.org/resources/iot-developer-survey/iot-developer-survey-2019.pdf

With data collection and analytics remaining a top three concern among developers, Amazon AWS offers arguably the most robust cloud-based IoT analytics solution in the market today.

The AWS IoT Analytics platform is a managed service that eliminates the complexity of operationalizing sophisticated analytics for massive volumes of IoT data. With AWS Lambda, developers also have access to a functional programming model that enables them to build and test IoT applications for both cloud and on-premises deployments.

In terms of data collection & analytics, Amazon offers two distinct services in the form of AWS IoT Analytics and Kinesis Data Analytics. 

AWS IoT Analytics has the capabilities required for a range of IoT applications, with built-in AWS IoT Core support to simplify setup. It makes it much easier to cleanse bad data and to enrich data streams with external sources, and it gives data scientists access to both raw and processed data, the facility to save and retrieve specific subsets of data, and flexible rule-based routing of data across multiple processes.

Kinesis Data Analytics is better suited for real-time data ingestion applications, like remote monitoring and process control, that require response times in the range of milliseconds. The service integrates with other AWS tools such as Amazon DynamoDB, Amazon Redshift, AWS IoT and Amazon EC2 to streamline the analytics process. The Kinesis suite comprises several services, including Kinesis Data Streams, Kinesis Data Firehose, Kinesis Data Analytics and Kinesis Video Streams. Amazon Kinesis Data Streams enables the continuous capture of large-volume, real-time data feeds and events of all kinds. Raw data from Kinesis can then be cleaned and processed through AWS Lambda or Amazon ECS. Kinesis Data Firehose prepares and loads streaming data into S3, Redshift or Elasticsearch for near real-time processing and analytics.
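As a hedged sketch of what ingestion into Kinesis Data Streams looks like with boto3 (the AWS SDK for Python): the serialization below runs as-is, while the `put_record` call is left commented out because it needs AWS credentials and an existing stream. The stream name, region and field names are assumptions for illustration:

```python
import json

def build_record(device_id, temperature):
    """Serialize one sensor reading into the bytes payload Kinesis expects."""
    return json.dumps({"device_id": device_id,
                       "temperature": temperature}).encode("utf-8")

payload = build_record("sensor-42", 23.7)

# The actual ingestion call requires AWS credentials and a provisioned
# stream, so it is shown commented out; names here are hypothetical.
# import boto3
# kinesis = boto3.client("kinesis", region_name="eu-west-1")
# kinesis.put_record(
#     StreamName="iot-telemetry",   # hypothetical stream name
#     Data=payload,
#     PartitionKey="sensor-42",     # Kinesis shards records by this key
# )
```

Downstream, Firehose or Lambda consumers would pick these records up for cleaning, enrichment and delivery to storage.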

While Kinesis offers developers more flexibility in development and integration, AWS IoT focuses on simplifying deployment using prebuilt components. It is possible to combine these two solutions to build a comprehensive IoT solution encompassing streaming as well as at-rest data.  

Late last year, Amazon AWS announced four new capabilities that make it easier to ingest data from edge devices. AWS IoT SiteWise is a managed service that makes it easy to collect, structure, and search data from industrial equipment at scale. With AWS IoT Events, customers can easily detect and respond to events from large numbers of IoT sensors and applications. AWS IoT Things Graph enables a no-code approach to IoT development, with a visual drag-and-drop interface that lets developers build IoT applications by connecting devices and services and defining their interactions. And finally, AWS IoT Greengrass Connectors enables developers to connect devices to third-party applications, on-premises software, and AWS services through cloud APIs.

Over and above all this, AWS has established a strong partner network of edge-to-cloud service providers and device manufacturers, offering customers the deep technical and domain expertise required to mitigate the complexity of IoT projects.

Apart from being a developer favorite, AWS IoT has also built up a client roster of some of the biggest brands in the industry including LG, Bayer, NASA, British Gas and Analog Devices, to name just a few. 

Notwithstanding the challenges of big data and analytics, there have been many successful IoT implementations across diverse sectors. Here are a few stories of how companies and their IoT partners have harnessed the power of big data analytics in IoT.

IoT data analytics success stories

Bayer & AWS IoT Core: Bayer Crop Science, a division of Bayer, provides a range of products and services that maximize crop production and enable sustainable agriculture for farmers worldwide. The company used IoT devices on harvesting machines to monitor crop traits, which were then manually transmitted, over several days, to its data centers for analysis. The lack of real-time data collection and analytics meant that Bayer could not immediately address any issues with equipment calibration, jamming, or deviations to help with routing plans for subsequent runs.

Already an AWS customer, Bayer’s IoT team decided to move its data-collection and analysis pipeline to AWS IoT Core. The company built a new IoT pipeline to manage the collection, processing, and analysis of seed-growing data. 

The new solution captures multiple terabytes of data, at an average of one million traits per day during planting or harvest season, from the company’s research fields across the globe. This data is delivered to Bayer’s data analysts in near real-time. The AWS IoT solution also provides a robust edge processing and analytics framework that can be scaled across a variety of IoT use cases and IoT initiatives. 

Bayer is now planning to use AWS IoT Analytics to capture and analyze drone imagery and data from environmental IoT sensors in greenhouses for monitoring and optimizing growing conditions.

Microsoft Azure IoT Hub & ActionPoint: Many manufacturers still use paper checklists, manual processes, human observation and legacy closed-loop technologies to monitor and maintain their equipment. Even in modernized plants, manufacturers often do not have the right sensors in place to provide all the data required, or they lack analytics solutions to analyze the sensor data.

Custom software developer ActionPoint partnered with Microsoft and Dell Technologies to develop IoT-PREDICT, an industrial IoT solution for predictive maintenance that incorporates machine learning, data analytics, and other advanced capabilities. The solution is powered by the Microsoft Windows 10 IoT Enterprise operating system running on Dell Edge Gateway hardware, and combined with the Microsoft Azure tool set to provide state-of-the-art edge computing. 

The combination of Windows 10 IoT Enterprise and Azure delivers a highly effective IoT solution that customers can deploy in minutes. It also gives the IoT-PREDICT solution the flexibility and scalability that allows manufacturers to start small with IoT and grow at their own pace.

IoT-PREDICT helps manufacturers quickly reduce downtime, lower costs, and increase the overall efficiency of their equipment and operations. It maximizes the impact of manufacturer data by using the Microsoft Azure IoT Hub to gather data and make it available to several Azure services, including Azure Time Series Insights and Azure Stream Analytics. Manufacturers can explore the data using Time Series Insights, or act on it with Stream Analytics by setting up queries and alerts based on various performance thresholds.

IoT data analytics has unique characteristics and challenges that conventional analytics technologies and capabilities cannot address. But as in any analytics operation, the primary objective remains the same: to generate actionable insights that deliver positive business value. It is not just about the choice of sensor, connectivity protocol or CSP; to ensure that every IoT project leads to demonstrable business value, organizations have to preserve the integrity of what McKinsey calls the insights value chain.

Pentair & AWS: Pentair is a water treatment company that offers a comprehensive range of smart, sustainable water solutions to homes, businesses and industries around the world. The company relies on connected systems to monitor and manage its product installations, most of which are in remote locations. Traditionally, the company took the custom-building route to develop its connected systems, which came with its own set of disadvantages.

Pentair needed a powerful, flexible IoT platform, with high availability and scalability and a high degree of reuse across all lines of its business. Pentair also wanted a comprehensive solution that covered everything from IoT data ingestion, to analysis and visualization. 

The company teamed up with AWS Partner Network (APN) Advanced Technology Partner and IoT Competency Partner Bright Wolf to evaluate potential technology providers including Amazon, GE, IBM, Microsoft and others against a set of platform characteristics. This included data ingestion, infrastructure controls, deployment options, machine learning and visualization tools, development tools and the overall openness of each platform.

“AWS came out on top when it came to the raw scoring,” says Brian Boothe, the lead for Pentair’s Connected Products Initiative.

To date, Pentair has deployed three different connected solutions using the AWS IoT platform and a flexible, scalable, and reusable reference architecture developed by Bright Wolf. The benefits, according to Pentair, include accelerated time to market for value-added services, simpler integration, cost savings from deploying commodity edge devices on the open AWS IoT platform, and enterprise-grade scalability and availability.

Recruitment success at Vinnter

During the spring and summer, we at Vinnter have been working extremely hard to find new employees for the best workplace in Göteborg (yes, we might be biased in this judgement… 😁). And our recruitment efforts have finally paid off.

We have been working with several tools and services in parallel since the autumn of 2018, but during the early spring we decided to give LinkedIn job ads a chance. This has proven to be the most cost-effective way of finding the best applicants possible.

A 3% hit rate may sound low, but we are extremely satisfied. We have held around 50 initial interviews and approximately 20 second interviews. Our best move, though, was deciding to develop our own test to put in the hands of applicants who pass the first interview. It has been a great help in filtering out those who really are ready to start with us from a competence perspective.

If you are interested in what opportunities you might find here, please head over to our career pages.

In the end it has resulted in us signing four new employees! We are super excited to get them on board (some have already started) and to have them contribute to our customers' success and satisfaction with Vinnter as their chosen partner for the development of IoT and connected things.

Curious about who we are? Well, just head over to an overview of our team! You can hire a team of these great people if you want!

Benedikt Ivarsson

Benedikt is a cool, calm and collected superhero who came to Sweden from our neighbour Iceland. He has extensive experience from a range of different roles and companies, and his competence is broad.

He is an ambitious and honest leader who does everything within his power to achieve a satisfied customer and a job well done. He finds it easy to keep the bigger picture in view when selecting a solution for a project.

With his positive attitude he makes a good team player as well as a good leader. His educational background is in Software Engineering, Project Management, IT Infrastructure and Enterprise

Specialties: Finding solutions, Flexible management, Project management, Product development, Pre-sale, Consulting, iOS, Objective-C, C, C++, C#, Java, Hosting services, Integration, Microsoft SharePoint, Kiosk, NCR, CRM, Provisioning, Mediation, Telco solutions, Health care systems, NATO, Financial management.

Jonathan Skårstedt

Jonathan is the quintessence of a developer working at Vinnter. Lots of humour, skilled, experienced and curious to learn new things.

His specialities are many! Some of them are:

Web, Payments, eCommerce, Security, Compilers, Linux, Relational Databases
Languages: Haskell, Java, Clojure, JavaScript, C#, Ruby (on Rails), CSS, Python, Erlang

The never-ending discussion of the best editor is a favourite topic. Jonathan prefers Vim, with a cup of tea.

Wishing you all a great summer!

The year 2019 has been a great one for Vinnter. Lots of things have happened, and much more is still to change before we may consider ourselves "done". Of course, we never get "done", since this is not how the world works. Our customers are in constant change, and so must we be. But it is not change for the sake of changing; it is about adapting to new circumstances in our business as well as in our customers' business.

Vinnter started out with a focus on developing electronics hardware for our customers, accompanied by the software needed to adapt the electronics to customer requirements and connect them to the Internet. Today our main business and our customers' requirements have changed somewhat, and our focus has turned towards more software-related development, both embedded and in cloud services. It is also abundantly clear that cloud services are here to stay when it comes to connected things.

During the year we have signed three new employees, and we are looking to employ even more. We have more to do than ever before and are looking forward to a great year, both financially and assignment-wise.

We look forward to interacting with you all again after the summer vacation! Meanwhile, we would like to wish you a great summer with the focus on relaxation and regaining energy through great activities with friends and family.

From all of us; HAVE A GREAT SUMMER!