Reading Time: 9 Minutes
The term ‘Internet of Things’ was coined by Kevin Ashton, co-founder of the Auto-ID Center at the Massachusetts Institute of Technology (MIT). The Internet of Things is simply an ecosystem of physical things connected to the internet.
These physical things include smartphones, tablets, wearables, and other devices that contain different types of sensors.
These devices have the ability to transfer data over the network.
Hardware/Sensing Layer - This is the very first layer of the IoT architecture. It consists of the hardware devices that connect to the Network layer. Wireless Sensor Networks (WSN) and Radio-Frequency Identification (RFID) are considered the two main building blocks of IoT. Arduino microcontrollers are directly connected to the sensors, and in turn to a Raspberry Pi, which connects to the internet over Ethernet or Wi-Fi and transmits the data collected from the sensors to the server in real time.
Network/Gateway Layer - This layer acts as the bridge between the Hardware/Sensing Layer and the Management Layer. It receives the digitized data and routes it over wired LANs, Wi-Fi, or the Internet for further processing. Different protocols can be used to communicate between the IoT gateway and the servers, such as MQTT, AMQP, CoAP, and HTTP; these are discussed in the next section.
Management Layer - This layer is responsible for data modeling and security control. All data handling and data processing operations are done here: the necessary data is extracted from the raw data transferred by the sensors.
Application Layer - This is the final layer of the IoT architecture. It consumes the data processed by the Management Layer.
Real-time stream processing refers to processing the data stream collected from IoT devices as it arrives. The following tasks can be included in this processing -
Transformation - This includes converting the data collected from the IoT device. After the conversion, the resulting data is passed on for further analytics.
Data Enrichment - Data enrichment is the operation in which the raw data collected by the sensors is combined with other datasets to produce richer results.
Storing Data - This task includes storing the data at the required storage location.
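The three tasks above can be sketched as a chain of Python generators; the field names (`device`, `temp_f`, `location`) and the in-memory list used as the "storage location" are illustrative assumptions, not part of any specific platform.

```python
# Minimal sketch of the three stream-processing tasks as Python generators.

def transform(readings):
    # Transformation: convert raw Fahrenheit readings to Celsius
    for r in readings:
        yield {"device": r["device"],
               "temp_c": round((r["temp_f"] - 32) * 5 / 9, 2)}

def enrich(readings, device_locations):
    # Data Enrichment: combine each reading with a reference dataset
    for r in readings:
        yield {**r, "location": device_locations.get(r["device"], "unknown")}

def store(readings, sink):
    # Storing Data: append each record to the required storage location
    for r in readings:
        sink.append(r)

raw = [{"device": "dht22-1", "temp_f": 77.0}]
sink = []
store(enrich(transform(raw), {"dht22-1": "lab"}), sink)
print(sink)  # [{'device': 'dht22-1', 'temp_c': 25.0, 'location': 'lab'}]
```

In a real pipeline the same three stages would run continuously over an unbounded stream rather than a finite list.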
MQTT - MQTT uses a publish/subscribe architecture. The central communication point of this protocol is the MQTT broker. Every client includes a topic name while publishing data to the broker; topics are the routing information for the broker. Each client that wants to receive messages subscribes to a particular topic, and the broker delivers all messages with a matching topic to that client.
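The broker's topic routing can be illustrated with a small sketch of the MQTT topic-matching rule, including the standard wildcards `+` (one level) and `#` (all remaining levels). This is not a real broker, just the matching logic a broker applies when deciding which subscribers receive a published message; the topic names are made up for the example.

```python
# Sketch of MQTT topic-filter matching as a broker would apply it.

def topic_matches(sub_filter, topic):
    f_parts = sub_filter.split("/")
    t_parts = topic.split("/")
    for i, part in enumerate(f_parts):
        if part == "#":          # multi-level wildcard matches the rest
            return True
        if i >= len(t_parts):    # filter is longer than the topic
            return False
        if part != "+" and part != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

print(topic_matches("sensors/+/temperature", "sensors/room1/temperature"))  # True
print(topic_matches("sensors/#", "sensors/room1/humidity"))                 # True
print(topic_matches("sensors/room1", "sensors/room2"))                      # False
```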
CoAP - The Constrained Application Protocol (CoAP) is a web-transfer protocol designed to connect lightweight devices to the Internet of Things (IoT). Like HTTP, CoAP uses a request-response model, and it allows GET, PUT, POST, and DELETE calls on resources identified by URLs.
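To make CoAP's lightweight design concrete, here is a sketch of its fixed 4-byte message header as defined in RFC 7252 (version, type, token length, request code, message ID). The helper name and the example message ID are our own; a real client library would of course handle this for you.

```python
# Sketch of the 4-byte fixed header of a CoAP message (RFC 7252):
# version (2 bits) | type (2 bits) | token length (4 bits) | code | message ID.
CON, NON, ACK, RST = 0, 1, 2, 3          # message types
GET, POST, PUT, DELETE = 1, 2, 3, 4      # request codes (class 0)

def coap_header(msg_type, code, message_id, token=b""):
    assert len(token) <= 8
    first = (1 << 6) | (msg_type << 4) | len(token)   # version is always 1
    return bytes([first, code]) + message_id.to_bytes(2, "big") + token

# A confirmable GET with message ID 0x1234 and no token:
print(coap_header(CON, GET, 0x1234).hex())  # 40011234
```

Compare this 4-byte header with the text headers of an HTTP request to see why CoAP suits constrained devices.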
AMQP - AMQP (Advanced Message Queuing Protocol) is an open standard for passing messages between applications and organizations. It connects systems and feeds business processes with the information they need.
HTTP - This is the standard protocol for web services and is still used in IoT solutions. The most popular architectural style, REST, is widely used in mobile and web applications and should be considered for IoT solutions.
Data ingestion is the first step in handling data from the IoT device. There are different options for collecting data from the sensors; below we discuss data ingestion using different platforms.
By using Apache MiNiFi on the Raspberry Pi - We can use Apache MiNiFi, a sub-project of Apache NiFi, on the Raspberry Pi. It is a very lightweight, low-resource agent. With it, we can collect data from the sensors and route it to Apache NiFi, from where we can route the data to multiple destinations.
By direct ingestion from the MQTT broker - The second way is to ingest data directly from the MQTT broker and route it to the required destinations. For this, we need to design the data flow pipeline using different Apache NiFi processors.
By using StreamSets Data Collector Edge - StreamSets Data Collector Edge is an ultralight agent for IoT which collects the data and sends it to StreamSets Data Collector.
By direct ingestion from the MQTT broker - Alternatively, we can ingest data directly from the MQTT broker and route it to the required destinations by designing the data flow pipeline using different StreamSets processors.
The speed of the data stream can also vary over time, so our data pipeline should be able to handle data of any size and at any velocity. The platform should be intelligent enough to scale up and down automatically according to the load on the pipeline.
After processing, the most important part is the visualization of the data. Xenonify offers a custom dashboard for data visualization. We can also build dashboards using Kibana or Grafana, or build a custom dashboard using React JS and D3.js.
Cloud Pub/Sub also natively connects to other Cloud Platform services, gluing together data import, data pipelines, and storage systems.
After ingesting the data from the IoT device, Cloud Pub/Sub sends it on to Cloud Dataflow for processing. Cloud Dataflow is used to create the data pipeline and perform data transformations.
Cloud Dataflow sends the processed data to BigQuery, which provides a fully managed data warehouse with a familiar, SQL-like interface. Cloud Datalab is then used for data visualization.
In the AWS IoT architecture, a Kinesis stream is used for data ingestion. For this, we define a Kinesis action that collects data from MQTT and sends it to Kinesis Analytics for further processing. After stream processing, the processed data is sent to Amazon Redshift and Amazon S3.
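The Kinesis action is defined on an AWS IoT topic rule. A sketch of such a rule payload is shown below; the topic name, stream name, and role ARN are placeholders you would replace with your own resources.

```json
{
  "sql": "SELECT * FROM 'sensors/dht22'",
  "actions": [
    {
      "kinesis": {
        "roleArn": "arn:aws:iam::123456789012:role/iot-kinesis-role",
        "streamName": "iot-sensor-stream",
        "partitionKey": "${topic()}"
      }
    }
  ]
}
```

The `roleArn` must reference an IAM role that grants AWS IoT permission to write to the stream.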
Here we can use Amazon QuickSight for data visualization. In Amazon QuickSight, we can build visualization dashboards, perform ad hoc analysis, and quickly get business insights from the data.
Microsoft Azure Event Hubs connects to MQTT via a cloud gateway and consumes the data published by the Raspberry Pi on the MQTT broker.
Azure provides Stream Analytics for data processing. In Stream Analytics, we can consume the data stream from Event Hubs and process it in real time.
DHT22 is a 4-pin sensor. It can operate on 3.3V, which makes it compatible with the Raspberry Pi 3.
Pin 1 needs to be connected to the 3.3V power source, pin 2 to a GPIO (general-purpose input/output) pin on the Raspberry Pi, and pin 4 to ground (GND); pin 3 remains disconnected. A 10kΩ pull-up resistor is connected between pin 1 and pin 2.
First, we need to make sure that SSH login is enabled on our Raspberry Pi. Then we connect to the Raspberry Pi using ssh.
It is always necessary to ensure your installed packages are up to date.
First, we need to run sudo apt-get update to update the list of available packages and versions.
Now we can update our installed packages using sudo apt-get dist-upgrade.
The DHT22 sensor we are using needs some additional packages.
The DHT22 is accessed using a Python library, so we need to install Python with -
sudo apt-get install python
Next, we download the library we will use for reading data from the sensor. We will use the library provided by Adafruit, so we clone its repository onto our Raspberry Pi -
Now that all the installations are complete, we can test the readings from the sensor. For this, we go to the folder below -
Now, we need to know on which GPIO pin the data pin of the sensor is connected and which sensor we are using.
This is the pin diagram for Raspberry Pi 3 -
Here we are using the DHT22 sensor with its data pin on GPIO22, so we execute -
./AdafruitDHT.py 22 22
This will give results as below -
We can then modify the AdafruitDHT.py script to push the readings to the MQTT broker. This is all the setup needed on the Raspberry Pi 3 to publish data to the MQTT broker.
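One way to do this modification is sketched below: format each reading as a JSON payload and publish it with the paho-mqtt package (`pip install paho-mqtt` on the Pi). The topic name, broker host, and helper names are our own assumptions; in the modified script you would call `publish_reading` with the values returned by the Adafruit library.

```python
import json
import time

def make_payload(temperature, humidity):
    # JSON payload that the downstream pipeline can parse
    return json.dumps({"temperature": round(temperature, 1),
                       "humidity": round(humidity, 1),
                       "ts": int(time.time())})

def publish_reading(broker_host, topic, temperature, humidity):
    # Requires the paho-mqtt package on the Pi: pip install paho-mqtt
    import paho.mqtt.publish as publish
    publish.single(topic,
                   payload=make_payload(temperature, humidity),
                   hostname=broker_host)

# e.g. publish_reading("localhost", "sensors/dht22", 25.0, 53.0)
```

Any of the ingestion options above (MiNiFi, NiFi, StreamSets, or a cloud gateway) can then subscribe to the same topic and pick up the readings.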
Healthcare is one of the sectors adopting the Internet of Things the fastest. A lot of new sensors are coming onto the scene, and these devices allow doctors to track whether their patients are sticking to their treatment.
IoT-enabled machinery lets operations managers manage factory units remotely. The use of IoT sensors in manufacturing equipment enables condition-based alerts when equipment moves outside its specified temperature and vibration range.
IoT keeps vehicles on the road by predicting maintenance needs, and it streamlines logistics by using real-time data and alerts to optimize delivery routes, monitor performance, and quickly respond to delays or issues as they happen. Real-time analytics can also help ease traffic congestion.
Predictive equipment maintenance is used for managing energy, predicting equipment failure, and detecting other issues. IoT in retail enables precise inventory management and, most importantly, enhances the consumer’s shopping experience.
IoT helps connect buildings and reduce management and utility costs by learning from the data you collect. IoT for smart buildings helps personalise and automate your building’s heating, cooling, and room utilisation, which in turn helps create a more comfortable and productive work environment.
XenonStack Real-Time Streaming Analytics Solutions for Enterprises & Startups -
Elixir Data is a Modern Big Data Integration Platform that enables secure Data Pipeline With Data Integrity and Data Management. Elixir Data provides you with the freedom to work in desired Data Stack and Programming Language as it integrates well with NoSQL & Big Data Ecosystem, traditional databases, and business tools. Elixir Data provides the Public, Private or Hybrid mode of Code Deployment. Choose the cloud platform that suits your enterprise requirements.
Real-Time Big Data and Advanced Analytics Solutions refine your continuous flow of data into useful information that fuels Real-Time Data Analytics. Build Real-Time Analytics Cloud using Apache Spark, Apache Flink, and Apache Beam. Real-Time Analytics Solutions offers Data Ingestion and Data Processing for IoT and Big Data Solutions, Real-Time Data Processing on Docker and Kubernetes, and Big Data Analytics Platform with Apache Spark and Apache Flink.
Xenonify is an Enterprise Full Stack IoT Platform with Artificial Intelligence and Machine Learning. Xenonify offers Identity & Access Management, IoT Protocols and Messaging, Streaming & Real-Time Data Integration & Analytics for IoT Solutions, Real-Time Predictive and Preventive Intelligence, and Microservices, Docker and Kubernetes for the Internet of Things.