Brief history of IoT

The Internet of Things, also known as IoT, has its roots in the late 1990s and early 2000s, when various researchers and technology companies started exploring the potential of connecting everyday objects to the internet.

The term “Internet of Things” was first used in 1999 by Kevin Ashton, a British technologist, to describe a world where everyday objects are connected to the internet and can share data with one another. At the time, the idea was still theoretical, and it would take several more years for the technology to catch up with the vision.

In the early 2000s, the development of wireless networks, such as Wi-Fi and cellular networks, made it possible to connect more and more devices to the internet. Around this time, some companies recognized the potential of IoT and began developing internet-connected devices for the home.

In the late 2000s and early 2010s, the growth of cloud computing and big data analytics made it possible to collect, store, and analyze large amounts of data from connected devices. This advance was a major catalyst for broader adoption of Internet of Things products across markets such as manufacturing, healthcare, and transportation.

This brief history of IoT shows how long the technology has been active in the market. Today, the IoT has evolved to the point where billions of devices are connected to the internet worldwide.

Timeline of the IoT – when IoT was invented

Let’s look at a short timeline of the evolution of the Internet of Things from 1999 to the present:

1999: Kevin Ashton coined the term “Internet of Things” while working at the Auto-ID Center at MIT.

2000-2005: The development of wireless networks, such as Wi-Fi and cellular networks, makes it possible to connect more devices to the internet. Some companies begin developing IoT products for homes.

2006: The launch of Amazon Web Services (AWS) makes it easier for developers to build and scale IoT applications.

2008: The release of the Android operating system opens up new possibilities for connected devices and the IoT.

2010-2012: Cloud computing and big data analytics give a major boost to data collection, storage, and analysis.

2014: The IoT becomes a major topic at technology conferences and events, and investment in IoT startups begins to increase.

2015-2017: The launch of IoT platforms, such as AWS IoT and Microsoft Azure IoT, makes it easier for developers to build and deploy IoT applications. The number of connected devices continues to grow, with billions of devices now connected to the internet.

2018-2020: The rollout of new technologies, such as 5G and edge computing (see IoT Edge), helps to improve the performance and reliability of the IoT. The use of IoT devices in the industrial and manufacturing sectors continues to increase, with many companies adopting Industry 4.0 and the “smart factory” concept.

2021-present: The use of IoT in areas such as healthcare, agriculture, transportation, retail, consumer electronics, and energy management continues to increase, and the development of new technologies, such as artificial intelligence and blockchain, is opening up new possibilities for the IoT.

