IoT Insights with InfluxDB and Golioth: Real-Time Monitoring Made Simple
Session date: Jan 14, 2025 08:00am (Pacific Time)
IoT deployments demand efficient and reliable pathways to process and analyze data from constrained devices operating in remote environments. This webinar dives into the technical collaboration between Golioth and InfluxData, showcasing how Golioth’s lightweight SDK (software development kit) seamlessly bridges device firmware with cloud infrastructure—while InfluxDB serves as the cornerstone for real-time monitoring and time-series data analytics.
Join us to discover how this integration empowers hardware engineers to streamline data pipelines, optimize device performance, and unlock valuable, actionable insights with minimal complexity.
This webinar will highlight:
- How Golioth helps simplify device-to-cloud data transmission using a lightweight SDK for power-constrained devices.
- The unique advantages of integrating Golioth with InfluxDB for real-time monitoring and historical data analysis.
- Key IoT use cases, like waste optimization and preventative maintenance, that showcase the power of IoT data insights.
Watch the Webinar
Watch the webinar “IoT Insights with InfluxDB and Golioth: Real-Time Monitoring Made Simple” by filling out the form and clicking on the Watch Webinar button on the right. This will open the recording.
Here is an unedited transcript of the webinar “IoT Insights with InfluxDB and Golioth: Real-Time Monitoring Made Simple.” This is provided for those who prefer to read rather than watch the webinar. Please note that the transcript is raw. We apologize for any transcription errors. Speakers:
- Dylan Swartz: Head of Product, Golioth
- Suyash Joshi: Developer Advocate, InfluxData
SUYASH JOSHI: 00:00
Thank you for joining us today. Before getting started, I would like to remind everyone of some housekeeping items. The webinar is being recorded and will be shared and made available on demand in the next 24 hours, along with the slides. If you have any questions, you can use the Q&A section at the bottom of your screen. We will answer these at the end, or you can just type them away in the chat. Don't forget to check out our InfluxDB community, our Slack channels, and our newly created Discord for more discussions following the webinar. That said, we are so excited to have you join us today for our webinar on IoT and Real-Time Monitoring using Golioth and InfluxDB. My name is Suyash Joshi. I work as a senior developer advocate at InfluxData. And with me, our main speaker today is Dylan. Dylan is the head of product at Golioth. Dylan, take it away. The stage is yours.
DYLAN SWARTZ: 01:03
Awesome. Thanks, Suyash. I’m excited to be here. As I mentioned, my name is Dylan, head of product at Golioth. Seems like a lot of folks are new to Golioth, so maybe I’ll spend a bit of time explaining what we do before we dig in. So Golioth is an IoT development platform. We were chatting earlier. It’s kind of interesting Golioth doesn’t actually make hardware. We’re mainly a cloud platform. We provide SDKs for firmware developers to connect pretty much any kind of IoT device. And I’ll explain more of the sort of class of devices that we focus in on. But we’re going to explore how Golioth and Influx can make Real-Time Monitoring from small, constrained IoT devices possible. I guess digging in here, what do we mean by IoT? Because IoT is very ambiguous. And at a high level, a lot of folks will think of IoT being a smart light bulb. You can change the color of it from your phone. There’s even kiosks and menus and displays and stuff like that. That’s sometimes considered IoT. Kitchen appliances, vehicles, and cars are connected today. If you’re a developer, a lot of folks might think of the Raspberry Pi, for example, but that’s not what we’re talking about here today. We’re talking about tiny microcontrollers. And I don’t know if you can see my little camera, but what I’m holding in my hand here is an ESP32. These devices are inside of all kinds of things ranging from trash cans that are outside as you walk through your city to the connected scooters and ride-sharing bikes and that sort of thing.
DYLAN SWARTZ: 02:58
That's sort of the class of IoT devices we're going to be talking about today. Microcontrollers, rather than full single-board computers, oftentimes don't have a full-blown operating system, but more of a real-time operating system. A lot of times, these devices rely on batteries, so they need to maximize energy efficiency as much as possible. And as we get into some use cases, you'll see they operate in sometimes pretty challenging environments. But ultimately, they're very small, usually designed for very mission-critical, very specific tasks. And they face challenges that are very different from the everyday devices that we use, like a smartphone or a PC with a web application. So, grounding in some real-world use cases here, one that's near and dear to my heart (you can see all the plants in my background) is environmental monitoring inside of greenhouses. For producing indoor crops, or even house plants like the ones behind me, it's important to control conditions like temperature and humidity, monitor the moisture of the soil, even CO2 levels, that sort of thing. These metrics need to be continuously tracked to optimize plant growth, help farmers optimize the yield of their crops, and even prevent issues like crop loss.
DYLAN SWARTZ: 04:33
This data needs to be streamed to the cloud efficiently to make real-time decisions, because if, say, the power goes out and the watering system fails, you might not know about that until the next day, and in hydroponics all the crops would be dead. A real example of a customer that Golioth has, the middle image here, is structural monitoring of infrastructure like bridges, buildings, and roads. We have a customer that embeds pressure sensors inside of concrete to monitor the stress levels and detect if buildings or bridges are deteriorating. These sensors generate time series data of sort of the pressure on different points within that bridge over time. So, you can predict ahead of time maintenance that needs to be done to either extend the lifespan or prevent critical failure. And these devices, as you can imagine, need to be super reliable and low power because they're literally embedded in concrete, and they need to last years and years and years. And they're often in hard-to-access locations, so they need to be able to be remotely monitored and managed. Another common use case inside of factories or industrial settings is monitoring equipment and heavy machinery. A lot of times it's done via vibration sensors. I've even heard of anomaly detection done through microphones, detecting sound to mitigate potential failures.
DYLAN SWARTZ: 06:15
Again, time series data helps kind of build up a model where you can start to do predictive maintenance and get ahead of problems and send out a technician before machines fail and cause downtime for a business or a factory. One thing that's interesting about these environments is oftentimes there's a lot of interference from all the metal and other machines. So, connectivity in these sorts of environments is often challenging. And then finally, Golioth has a number of customers in this space, which is waste management. So, trash cans and dumpsters all around cities, even within buildings, stream up effectively how full they are. And this is usually done by a sensor that sends a laser down, which bounces back and measures the distance from the lid to the bottom, so it can determine how much trash is inside of there. Or there are other sensors, like an accelerometer, to determine if it's been knocked over and that sort of thing. And these can all create a map of effectively all the trash in the city and optimize the route for a garbage truck to drive around and only pick up and service the dumpsters and the bins that need to be emptied, rather than hitting every single one and sort of wasting time and money there. The key thing, or the common thread through all of these, is that they typically generate time series data. And that time series data is a data point that has, say, for that greenhouse use case, temperature, humidity, and CO2 levels with a timestamp at a very specific moment in time. That data usually is sequential, coming in at an interval.
DYLAN SWARTZ: 08:05
So, every five minutes, every 10 minutes, maybe once an hour. And that data, kind of if you sum it up from thousands of devices, forms this continuous stream or fire hose of data into some sort of cloud system reflecting changes over time across thousands, potentially, of devices. So, it’s very high volume. And as a cloud infrastructure builder, it takes a lot of work and effort to maintain systems like that. And every single IoT product really shouldn’t have to build all that infrastructure themselves. If you’re monitoring a greenhouse, you don’t want to worry about site reliability and that sort of thing. So processing, storing, analyzing this data in real time requires purpose-built systems that can handle those unique demands of time series data. Suyash, do you want to maybe chat a little bit more about time series data?
SUYASH JOSHI: 09:10
Yes. So, like Dylan was mentioning, that's what time series data looks like, and we saw a little screenshot. Well, some people ask me, "Can I just store this data in a normal, like a relational database?" Because you can create a column with a timestamp. And one can, but it's just not the ideal way to do it, especially at scale, especially when you want to do a real-time query. The relational databases aren't meant for that. As a matter of fact, I like to show this slide because most people are familiar with your big PostgreSQL, Oracle, MySQL databases, which have been there since the 1970s, if not earlier. And we have been using them for all kinds of use cases. But as the technology industry has been evolving, databases have been evolving. So, we started with the relational database, and we still generally use them for transactional types of data. For example, e-commerce, right? You don't care about time. You just want to know how many purchases were made, like in the past month, or when you have a login system on your website. You would use something like that for that. But then as databases evolved, there was a need for unstructured data. And there came MongoDB, which you can call a document storage database, for cases such as, "We want to store all these chats that are happening." Or these days, you can imagine ChatGPT. Where would you store all the text? You would store it in something like a NoSQL database.
SUYASH JOSHI: 10:57
Then there was a need for a particular type of search. For example, we want to search on our website. How do we build a good search index? How do we search just a full word or a keyword or just some random text? And there came the need for a search database. And Elastic is the leader in that space. Similarly, for time series, InfluxDB was one of the first ones, the major ones. There have been iterations of that previously as well. But since its launch, it has been the leader in the category. And it's not just me speaking because I work at the company. But if you go to this website called DB-Engines, it's a third-party website that ranks databases across various categories. And for time series, you can go in today and see the metrics. They do various benchmarking, and it's the leader by far. So today, IoT is the perfect use case for time series data, like Dylan mentioned. And we're going to see various use cases for that. So InfluxDB comes in different flavors. At the heart is our open-source database, but then we also have various cloud flavors of it. So back to you, Dylan.
DYLAN SWARTZ: 12:15
Awesome. And I will say, sort of anecdotally, many of Golioth’s most successful customers are using InfluxDB. We built the initial integration with you guys at the request of one of our larger customers. So can confirm that folks like it and use it for this use case. So anyway, we were talking about the time series data and the constraints of these different IoT devices. And across multiple different industries, we have these tiny little devices that are often in these constrained environments such that they must conserve power. We mentioned the concrete sensors. If you’re trying to get a device to last 5, 10 years, the things that you need to do to conserve power oftentimes are very extreme. These devices are in situations with limited bandwidth. You may be on a cellular network like LTE-M or narrowband LTE, which is much lower bandwidth than the 5G that your smartphone uses. And they can be in rural areas where connectivity coverage is not great. The task that these devices have is also very important, even though they’re in such challenging environments, which is sensing and pulling data from the physical world and making that digital. So why is this a problem? I mean, why can’t these devices just call an API and send the data to the cloud?
DYLAN SWARTZ: 14:06
We have a lot of different modern programming languages and libraries today that let you sort of make HTTP requests and that sort of thing. But the problem is that these tiny microcontrollers, especially when you’re concerned about limiting power consumption or you only have 256 kilobytes of RAM, they can’t handle the overhead of a networking protocol like HTTP. There are other protocols like MQTT that you may have heard of, but those also are oftentimes too power hungry for these sorts of use cases when you have limited processing power and memory. With the need to conserve battery life for up to 10 years, like the concrete use case, I think the waste management folks that I’ve chatted with usually are targeting two to three years of battery life. Everything you can do to save power and memory is important. As I mentioned, the devices and factories often have lots of network interference as well. So, you’re dealing with network connectivity that’s constantly dropping, potentially losing packets. So, you need to sort of have a solution that can be resilient to those sorts of network conditions. So basic things you would do in a web application, like calling an HTTPS API may not even work or overwhelm these devices and drastically reduce their battery life.
DYLAN SWARTZ: 15:51
Influx has tons of tools and connectors to ingest data, like Telegraf, and they're amazing. But not just Influx: cloud technologies in general oftentimes are built for more powerful devices like an NVIDIA Jetson or your PC. And this simply doesn't work for these microcontroller-based devices. And that's why Golioth exists, effectively. So, we specialize in bridging that gap between the hyper-constrained IoT devices and the unconstrained world of cloud-based systems. Our firmware SDK abstracts away all that complexity of securely connecting to the cloud over an efficient protocol. Under the hood, we use a protocol called CoAP, the Constrained Application Protocol. But we abstract that away so that you as a firmware developer don't need to learn all these esoteric networking technologies. And then we handle efficiently transmitting that data to the cloud, even on low-bandwidth networks, like I mentioned, narrowband or LTE-M. And we handle end-to-end encryption and authentication. So, you as the developer don't need to deal with all that stuff. So, what it would look like as the firmware developer is, you simply call a function within our SDK called stream, pass in that temperature and humidity data to that function call, and our SDK takes care of doing that security handshake, formatting the data, and the networking so that you can focus on building your application.
DYLAN SWARTZ: 17:36
The question is then, how does that all flow into Influx? Where does Influx fit in? And Golioth has a product called Pipelines that allows you to seamlessly kind of stream that data once it gets to our cloud to any sort of destination that you want. Whether this be Influx or your own backend API or event queuing mechanism like Google Cloud Pub/Sub or Amazon SQS, your device sends that data using our firmware SDK over secure protocol. Golioth handles the security, the retries, the buffering to ensure reliable delivery. And then that data is then routed to Influx through sort of a no-code connector, which you’ll see in a bit. And we designed pipelines to sort of minimize that complexity of managing IoT data streams while ensuring scalability and reliability. So, whether you’re working on a single device as an individual developer or managing a fleet of thousands, we make sure that the data gets to where you need it to go. And I see there’s a question in the chat about how Golioth compares to something like LoRaWAN. LoRaWAN is a networking technology. We are a cloud platform. So, we support many different networking technologies from cellular, Wi-Fi, Ethernet. We have some folks using things like LoRaWAN or Thread or Bluetooth on sort of end devices. Those usually require a gateway then that has the network connectivity. And we can help kind of facilitate the communication across those and then up into the cloud.
DYLAN SWARTZ: 19:24
And maybe this architecture diagram will help explain a little bit. I don't have the topology of sort of an end LoRa node and the LoRaWAN gateway or a Bluetooth device, but we can support that topology as well. But starting on the left, we have devices. You can imagine these being sensors in a greenhouse or sensors in some trash cans around the city. They're running firmware that leverages the Golioth firmware SDK. And this is open source on GitHub. I can point to all these links at the end. These devices will then establish a secure connection to Golioth's cloud using DTLS for the security layer and CoAP as the networking protocol, specifically designed for these constrained devices. The firmware SDK, as I mentioned, abstracts away all this complicated networking. Once the data gets into the cloud, we sort of have three different areas of our products that handle the different needs of an IoT device. There's a box in the middle that says device management. We're not going to talk too much about that today, but you can think of this as functionality that every single IoT device needs, like firmware updates in the field. We want to be able to deploy updates to these devices, streaming logs for debugging, configuring settings maybe like a log level or a debug mode, that sort of thing. We have a whole bunch of features around that, and we handle all those features along the same secure channel as you would with data streaming services.
DYLAN SWARTZ: 21:14
Then the box at the bottom is kind of the key piece we want to talk about here, which is our data streaming services. So, we handle data ingestion. Our pipelines product also gives you the ability to do data transformation and sort of modify or clean up data, helping translate it from that low-power constrained device world, where you might be sending up protobufs or some kind of binary data stream, and then convert it into a JSON format for your backend API or line protocol for Influx. The data will flow through there, and then you can connect it to any backend destination, whether it be your database or RabbitMQ or Kafka to get it into your database. And our data services or pipelines infrastructure is highly reliable. It uses a Pub/Sub architecture, ensuring data is queued even if that downstream destination, for example, is offline. Maybe you're hosting your own Influx, that sort of thing. So, we can kind of queue that up. And users configure how this data flows through YAML files. I'll show you how that works in a demo in a bit. But for example, you can define a policy that will say all my temperature and humidity readings flow to Influx, but then my crash dumps and logs flow to Datadog or something like Memfault for IoT device debugging. But as I mentioned, for a lot of our customers, the end destination here where it says database is going to end up being Influx. So, I'll let Suyash chat a little bit more about that.
SUYASH JOSHI: 23:03
Yep. Well, for those who are already familiar with InfluxDB, you might be familiar with these things. For those who are new, what you're seeing right now is that the whole InfluxDB platform has a couple of different components. And when I say components, I mean things such as data collection: how do you collect data? And in the next slide, I'll show you this in a deeper way. There's a scripting language to interact with your data. Of course, you can also connect your applications via our SDKs and APIs. And at the heart is the database engine. The new announcement that just happened yesterday is InfluxDB 3. Two flavors of that have come out in alpha release. The first one is Core, which is our open-source version. And the second one is InfluxDB 3 Enterprise, which is the commercial offering. It's a single-node offering that enables a lot of things, including compaction, and allows you to scale horizontally. And then we've had these before for some time, which are our Serverless offering, Cloud Dedicated offering, and Clustered offering. So, there are different flavors of InfluxDB for different use cases. I'm happy to answer any questions you have around these. But I just wanted to give you this overview. If we go to the next one. And this slide explains more in the context of the industrial IoT ecosystem.
SUYASH JOSHI: 24:46
So, like Dylan was showing, their devices and their SDKs are used in various use cases, such as in plants and machinery. So that is kind of the source of the data, if we look at it from the data pipeline point of view. Your various industrial assets, your various sensors: all of this is generating the data. And then you want to collect this data meaningfully. So, there are various ways to do that. We have a tool which is open source, and a lot of you might already be familiar with it, which is called Telegraf. And it's a tool that allows you to plug into different networking protocols such as MQTT, Modbus, and various other protocols that are very common in industrial IoT settings. And it can take in data from those and bring it to InfluxDB. You can directly connect your assets if they can run, let's say, a Python program, a Python server. If you're running it on some kind of a Linux environment, then you can directly send the data to InfluxDB using client libraries. Or you might want to use middleware, other third-party tools for very specific needs, such as Golioth. And once the data is collected, it's stored in the database. Various things happen in the database. So, the database engine has to manage all this data in a meaningful way, in a logical way, in an optimal way. So, we use a lot of open-source technologies ourselves.
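For readers who want to try the client-library path mentioned above, here is a minimal sketch using the InfluxDB v2 Python client (influxdb-client). The URL, token, org, bucket, and tag values are placeholders for illustration, not details from the webinar:

```python
# pip install influxdb-client
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

# Placeholder connection details; use your own InfluxDB URL, token, and org.
client = InfluxDBClient(url="http://localhost:8086", token="MY_TOKEN", org="my-org")
write_api = client.write_api(write_options=SYNCHRONOUS)

# One time series point: a measurement, a tag identifying the device,
# sensor fields, and a timestamp assigned at write time.
point = (
    Point("sensor")
    .tag("device", "greenhouse-01")
    .field("temperature", 21.4)
    .field("humidity", 56.0)
)
write_api.write(bucket="thingy-data", record=point)
```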
SUYASH JOSHI: 26:27
So, there is something called Apache Parquet, which is a file format for storing data in a very efficient manner. There is a SQL query engine, which is powered by Apache DataFusion. We keep data in memory so that you can quickly query it. So, we use something called Apache Arrow for that. So, there are a lot of open-source technologies and our own secret technology that go into the heart of the database engine. And then the last layer of the pipeline, if you look at it that way, is when you want to get the data out. You can, of course, run SQL queries. You can connect via, again, our APIs and SDKs to query data, or use other third-party tools. A very popular one is Grafana, which I think we'll see in the demo today, to visualize the data and create real-time dashboards, especially for real-time use cases. Or if you want to take the data to a data lake, you can take the Apache Parquet files into a data lake where you can do machine learning on it or train on the data for some other use cases. And there are other third-party tools. So, we integrate with a lot of different tools. And all of these are available on our website. And I'd be happy to answer any questions that you may have. But with that said, let's go back to Dylan.
DYLAN SWARTZ: 27:56
Awesome. All these different integrations, and sort of Golioth being pictured there as middleware, I think, is a good representation of how we work together. Some of our philosophy at Golioth is that you should use the best tool for the job. So, we consider ourselves a universal connector to whatever technology or tool that you might need to use to visualize your data or build your application. That's why I'm excited we're working together. With effectively a single line of code and some YAML, we can unlock streaming and storing time series data at scale. And that opens innovation for things like condition monitoring: monitoring vibration in industrial equipment to detect anomalies before a critical machine breaks down or fails. Environmental monitoring: we have a nonprofit customer at Golioth that's tracking air quality, monitoring climate change effectively all around the world in all kinds of different remote locations. In sort of commercial settings, you can use that information to trigger alerts when certain thresholds are exceeded and that sort of thing. And as I mentioned, for the demo today, we're going to focus on this greenhouse monitoring application. As you can see, I have a lot of plants behind me that I need to take care of.
DYLAN SWARTZ: 29:27
In Chicago here, it is 14 degrees Fahrenheit outside. My heat keeps running and drying the place out. So, I have a humidifier running that's trying to keep the plants all happy and healthy for my webinar background. But yeah, so let's dive into it. What we're going to see here in the demo is sensor data being collected using a Thingy:91 from a company called Nordic Semiconductor. It's using the nRF9160 cellular modem from Nordic. And it has a couple of different sensors on board to collect humidity, temperature, light sensing, a bunch of different sensors, because it's a prototyping dev board type of platform. We're going to stream that data into Influx via Golioth. And we're going to visualize that inside of Grafana over time and do some real-time alerting as well. Because when my humidifier runs out, the humidity drops, and I'm going to want to know so I can refill it. I mentioned the Thingy:91. I have that dangling behind me. I'll turn the screen share off in a second and show it on the camera. But this is a board that you can buy from Nordic Semiconductor. It's kind of like an out-of-the-box working prototyping kit that can be configured to just automatically stream all kinds of different sensor information. They just came out with a new one as well called the Thingy:91 X, which is actually cheaper. And it has Wi-Fi and cellular on board, which is pretty cool.
DYLAN SWARTZ: 31:11
What I'm going to be using in the demo is open-source code. We have a Golioth example running on this Thingy:91 hardware that's kind of like the kitchen sink demo for us. It uses all the different functionality, from logging to doing remote firmware updates and streaming data. So, for this demo, I effectively just downloaded this and flashed it on the device. I'm not going to do that live because it takes a few minutes to build and all that. But if you go to the releases here, if you pick up a Thingy:91, you can download even a pre-built binary and flash it just to try out Golioth. I will end the screen share for a second so that you can all see it larger. So, I've got this here hanging by all the plants. And you might be able to see there's some mist coming. So that's the humidifier. But this little orange thing is a cellular antenna. The actual code is running on this little chip. So, it's super tiny. This is really it: when I'm saying a microcontroller, it's this device here. The rest of it is all PCB, sensors, LEDs, that sort of thing. So, it's a little big because it's a development prototyping tool. But this device is running the Golioth Firmware SDK, and that's going to stream up the humidity information to Golioth. I'll share my screen again here so we can see the Golioth web console.
DYLAN SWARTZ: 33:01
So, as I mentioned, we're an IoT development platform. And what I mean by that is it's an all-in-one platform for everything you need to develop, manage, and maintain IoT devices at scale. So, we're going to focus on the data streaming piece. But we have a whole dashboard to manage your devices, sort of visualize how much data they're streaming so you can kind of be conscious of your cellular bills, different information about when they last connected so you can debug why your device is offline, manage security credentials, deploy firmware updates. You can bi-directionally sync application state to and from the cloud. You can send up debugging logs, that sort of thing. And all this is kind of just that default example firmware that I showed on GitHub that you can download. Run that and configure the credentials, and you'll see all of this populate in the Golioth console. For the data streaming piece, by default, we have a really lightweight sort of time series service that a lot of folks use for prototyping just so you can quickly see the data in here, in the console. This is all stored on Golioth, but only for 30 days by default. It's not meant for building your production applications on top of. As I mentioned, we have something called pipelines. Pipelines by default are configured to kind of stream all that data and temporarily store it in Golioth in that stream service that I mentioned. And this is by default set up and configured for you to route the data coming in.
DYLAN SWARTZ: 34:55
You can filter it based on a path. So, instead of an asterisk for everything, I could say I just want my sensor data that's being sent in on that path to go here. This is kind of set up for you. You can see it kind of accepts batches of information. So that's one step that you can do: batch information and send it to a destination. We can transform it. So, we extract the timestamp from the payload rather than, say, the timestamp of the request coming in, and then send that to various destinations. And like a lot of Golioth customers who sort of graduate off of this stream service that we have, I actually don't want to use this. So, I can turn this off, and my data will stop being stored on Golioth and instead use this additional pipeline that I have set up here that streams data into Influx. So, I can configure, again, all my data that's of type JSON. You can additionally filter not just by path, but by type of data. So, if you have, say, images coming in, like from a smart camera, you can send those to, say, Amazon S3, and then send your JSON data to Influx. I want to send that to the destination Influx. And all of this is outlined in our documentation: how to configure these pipelines and different destinations.
DYLAN SWARTZ: 36:35
So, if you go to the InfluxDB destination in our docs, you can quite literally copy and paste this. Or in our examples down here, we have an Influx example where you can just click this button, use this pipeline, and it'll open it up in your Golioth account. You can pick the project that you want to put it in, and it'll create a brand new pipeline with this configuration. You just hit Save after putting in your own API key, and it will just start streaming to your database. So, this one here is already kind of set up. We have something called Secrets that allow you to reference variables here. So, I already added my Influx API key as a secret. I have a bucket inside of Influx called Thingy Data for the Thingy:91. And then my measurement is sensor, because I have a top-level JSON object called sensor. If you remember from the screen here, all this data that was sent previously has accelerometer data, light sensor readings, and then the environmental conditions, humidity and temperature, which are kind of the two that we care about here. But now that we have this pipeline set up, you can see there's been data streaming through it for a while now. We can go into Influx Cloud and see that data flowing through. So, this was the same bucket I had here. The measurement was sensor. If I run this query to pull data from the last hour, you can see I have data coming through. And we can see the humidity is 56% and the temperature is 21.44 Celsius.
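As a rough sketch of the kind of "last hour of sensor data" check being run here, the same query can be issued with Flux through the InfluxDB Python client. The bucket and field names below are assumptions adapted from the demo (a bucket for the Thingy:91 data and a measurement called sensor); the URL, token, and org are placeholders:

```python
from influxdb_client import InfluxDBClient

client = InfluxDBClient(url="http://localhost:8086", token="MY_TOKEN", org="my-org")

# Pull the last hour of temperature and humidity readings from the demo bucket.
flux = '''
from(bucket: "thingy-data")
  |> range(start: -1h)
  |> filter(fn: (r) => r._measurement == "sensor")
  |> filter(fn: (r) => r._field == "temperature" or r._field == "humidity")
'''

for table in client.query_api().query(flux):
    for record in table.records:
        print(record.get_time(), record.get_field(), record.get_value())
```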
DYLAN SWARTZ: 38:34
You can learn a lot about the conditions in my apartment from watching this webinar. Getting the data into Influx this quickly and easily is awesome, but we don't have to stop there. We can do a lot with this data. Many folks building IoT devices will usually from here create their own APIs, maybe a mobile app, to do something with this data, whether it's visualizing it or making decisions. A lot of folks may be building internal solutions to monitor, say, their own factory or their assembly lines or their own greenhouses, and might just set up a Grafana dashboard to visualize this data. They don't even need to build any kind of custom infrastructure code around this. So, I have an example of that here. Grafana has a free cloud-hosted offering, and it's also open source, with the ability to host it yourself. We're using the cloud-hosted offering here. And I set up a quick little dashboard, which literally took five minutes to create, to visualize the temperature and humidity over time. The way this was created is Grafana has something called data sources. And under data sources, you can add a new data source. You simply search for InfluxDB. You connect that data source. You choose which query language from Influx you want to use. I chose Flux because it's got more capabilities than simply SQL. And then you add the URL to your Influx database, which is the same one I had in my pipeline here. And stick that in there.
DYLAN SWARTZ: 40:34
Add your organization ID from Influx, your API token, and the name of the bucket. Save that. And magically, all your Influx data is available, assuming you have read permissions on that API token. We can take a look at the dashboard that's already set up. So, I'm not exposing API keys on a webinar. This temperature and humidity chart here was created by clicking this Add Panel button, but I can also edit this one and show how it's constructed. So, once you add a data source, when you add a panel to this chart in Grafana, you click on that data source, and then it gives you this interface to perform queries on that data. So, I'm querying that Thingy Data bucket and giving it a time range. Grafana gives you this nice little facility to do variables in your queries, so it'll put in your chart the ability to select a time range up at the top and then drop that into your query. And I want to look for my sensor measurement, which has all my environmental data, and then filter down to the field of temperature. And then, because I'm an American, I mapped that Celsius to Fahrenheit, because my brain doesn't work when I'm looking at Celsius. We saved that.
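The panel query described above, with the Celsius-to-Fahrenheit mapping, would look roughly like the Flux below (shown here as a Python string for consistency with the other sketches; in Grafana it goes straight into the panel's query editor). v.timeRangeStart and v.timeRangeStop are variables Grafana injects; outside Grafana you would replace them with a literal range such as -6h. Bucket, measurement, and field names are again assumptions based on the demo:

```python
# Roughly the Flux used in the Grafana temperature panel.
panel_flux = '''
from(bucket: "thingy-data")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)  // Grafana-injected time range
  |> filter(fn: (r) => r._measurement == "sensor")
  |> filter(fn: (r) => r._field == "temperature")
  |> map(fn: (r) => ({ r with _value: r._value * 9.0 / 5.0 + 32.0 }))  // Celsius to Fahrenheit
'''
```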
DYLAN SWARTZ: 42:19
Looking at these two charts, which are both created in the same way, we can see that my temperature goes up and then drops back down, and up and drops back down, in conjunction with the humidity, which goes up as the temperature decreases and then back down. That tells me whenever my heat turns on, the humidity drops, and then my humidifier is working really hard to try to keep it up. Something else we can glean is that around 8:15, I think, is when I took a shower. So, it steamed up my apartment here and the humidity rose. And now it's starting to drop back down. I like to keep the humidity up above 50% for the plants' health. And we can see that it's dropping below there. We can see also when the heat's on, it really gets down to like 30, which is pretty low for the types of tropical plants that I have. So, I would like to know about that and then try to figure out how to either turn up my humidifier, or maybe it ran out and I can add water to it. Grafana has a cool alerting capability. We'll discard the changes on the dashboard and set up alert rules. So, I have one here set up, but it's turned off. Either clicking Create Alert Rule or editing one here gives you the same interface. So literally just copy-pasting that same query from our dashboard gives us the humidity time series data in this alerting capability.
DYLAN SWARTZ: 44:07
And what this will do is effectively poll this query on a cadence, and then a function here reduces it down to the last result. And we can take that last result from the query and do a conditional: is this value below 50? We can see the data is mapped here as a preview. So, when you're building this, you can kind of determine whether your alert is going to fire or not based on your condition. So, we can see it's at about 45% humidity. That's below 50, so it should be firing. We can tweak this just to check: if we preview that again with it set to 40, it would not be firing. So, we can confirm that this is working. So, I want to be alerted if that goes below 50%. There are a couple of different configurations. These get pretty advanced. By default, this is set to 10m, which is 10 minutes. What it'll do is keep monitoring it for a period of time and validating that the value is still breaking that conditional before sending the alert. This typically is used in monitoring database systems and stuff like that for errors and logs, so you don't want to get blasted with notifications when one sort of anomaly is detected. But in my case, I do want to know immediately if it drops below. So, if you set that to 0, it will send you the alert immediately. This Pause button is how I kind of disable the alert. If you turn this off, it is now evaluating on that cadence. And if we save this alert, the last thing you need to do is configure what you want to happen when that alert is fired.
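Conceptually, the alert rule does something like the following: query recent humidity, reduce to the last value, and fire if it is below the threshold. This is an illustrative sketch of the logic only, not how Grafana implements it; the bucket name and connection details are placeholders, while the measurement, field, and 50% threshold come from the demo:

```python
from influxdb_client import InfluxDBClient

THRESHOLD = 50.0  # fire when humidity drops below 50%

client = InfluxDBClient(url="http://localhost:8086", token="MY_TOKEN", org="my-org")

# Grab the most recent humidity reading from the last 15 minutes.
flux = '''
from(bucket: "thingy-data")
  |> range(start: -15m)
  |> filter(fn: (r) => r._measurement == "sensor" and r._field == "humidity")
  |> last()
'''

for table in client.query_api().query(flux):
    for record in table.records:
        humidity = record.get_value()
        if humidity < THRESHOLD:
            # In Grafana, this is the point where the contact point (email, webhook) fires.
            print(f"ALERT: humidity {humidity:.1f}% is below {THRESHOLD:.0f}%")
```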
DYLAN SWARTZ: 46:04
And you can do that by sending API calls or sending emails. I did that through the contact points capability. You set a list of email addresses and a notification policy. So, by default now, when any alert in here fires, it will send me an email, hopefully. And I believe by default, this is set to run every one minute. So, we should see here momentarily if the humidity is still below. We can kind of guarantee it stays below if I unplug it and move it over closer to me so it's not in the mist. We should be able to see in Influx, if we run this, the humidity is below 50%. It's at 44. And inside of our alerts, we can see now it just ran and it's firing. So, it should send me an alert now that it's dropped below 50%. And if I check my email, hopefully I got an email that says it's fired. Sometimes it takes a second to send. But we can see what that looks like effectively. This is a default. I didn't really do any customization, but you can make this say or look however you want. New message. So, we just got it. And it's saying, "Here's the value, 44%." I can do what I will with this information. In a real-world setting, you might actually trigger some automation from that. You might turn on a humidifier via a relay. There are all kinds of different things you could do.
DYLAN SWARTZ: 48:02
You could send a call to a webhook rather than sending an email, or start to use this data to build up historical information to detect in advance when the humidity is going to drop and take some preventative measures, like turning on the humidifier ahead of the heat coming on, that sort of thing, or even train an AI model to fully automate and control these systems. Yeah, that's about all that I had for folks today in terms of the demo. Just a couple of helpful links if anybody wants to take a screenshot here. Getting started with Golioth is free, and we have a free-forever developer account that comes with some pretty generous usage. We give a whole gigabyte of OTA updates every month for individual developers because we think all IoT devices should have remote firmware updates. And for context, these devices are really tiny. So, an update is like a few kilobytes. We also give an allotment of free data streaming. Our docs kind of go through all our different services in terms of reference and guides. But we do also offer a pretty in-depth training, not just on Golioth, but on firmware development with an operating system called Zephyr. And we have a bunch of open-source reference designs as well that sort of have a guide on how to build prototypes of different hardware for many different use cases.
DYLAN SWARTZ: 49:43
And I think the link here is incorrect. So, if you go to projects.golioth.io, there's a bunch of open-source hardware built with off-the-shelf components that you can buy at DigiKey or your favorite distributor of electronic components. And we have all kinds of different reference designs here that allow you to not only build the hardware, but then stream the data through Golioth and visualize it in Grafana as well. Awesome.
SUYASH JOSHI: 50:22
All right. Well, thank you, Dylan. That was great. I hope everybody's very excited to try Golioth. And if you're new to InfluxData, try InfluxDB as well. Any questions at this point? If you have any questions, just type them in the chat or in the Q&A. In the meantime, Dylan, feel free to scan the chat and reply to any comment or question if you see one.
DYLAN SWARTZ: 50:57
I see a comment about using Balena. Yeah, we know Balena very well. Usually, balenaOS is more targeted at more powerful devices like the Raspberry Pi single-board computers, whereas Golioth targets specifically low-power, microcontroller-based devices, for some context there.
SUYASH JOSHI: 51:23
I'll answer one question around open-source InfluxDB: any limitations or disadvantages? Well, that is a very tough question to answer because it really depends on the use case. Most of our users start with the open-source version. So, I would encourage you to try the open-source version, if you haven't already, and see if you run into any challenges. At a certain point you might, or you may not; it really depends on the use case. The cloud versions, or the commercial offerings, have many benefits. You don't have to manage it yourself; you manage it using our cloud service. We also have a commercial version that you can host on-premises, so you don't have to run everything in the cloud. But there are other benefits in terms of scaling and in terms of security that come with the paid product that don't come with the open-source version. Any other questions? What about comparing it to RabbitMQ or Kafka? Is that a fair comparison with Golioth?
DYLAN SWARTZ: 52:49
I think if you look at purely the data streaming piece, we offer a lot of similar benefits in the sense that it’s an event sort of driven infrastructure streaming data through. We have folks oftentimes that use a RabbitMQ or a Kafka as part of their backend infrastructure. But the problem is these IoT devices can’t talk directly to that for security reasons as well as just the nature of the protocols that these devices need to speak to conserve battery, have reliable transmissions. So, we think of Golioth as a connector to those rather than a replacement for those because the cloud infrastructure likely, as you’re building highly reliable web services, needs to be event-driven as well. So, you can think of Golioth as a connector to those.
SUYASH JOSHI: 53:53
Do you want to take Justin’s question? I can also chime in.
DYLAN SWARTZ: 53:57
Maybe from the Influx perspective, what sort of resources do you see folks using for doing anomaly detection?
SUYASH JOSHI: 54:09
Well, anomaly detection really comes in different levels. You can detect anomalies easily based on trends or charts and then set up triggers, something like what Dylan was showing using alerts, right? There can be more complex anomalies where you might need to train on your time series data using a time series AI model. So, there are now machine learning models specifically for time series data. And then you would train it so that it knows: this is an anomaly, this is not an anomaly, something like that. I mean, there are also large language models these days. So, there are many ways to go about it. There is the AI approach. There is the more mathematical approach, the old-school way. There is the simpler approach. So, it really would be the additional business logic that you would have to write using— Yeah.
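As a tiny illustration of the "more mathematical, old-school" approach mentioned above, a rolling mean and standard deviation check is often enough for simple anomaly detection on a single sensor stream. This is a generic sketch, not an InfluxDB or Golioth feature; the window size and sigma threshold are arbitrary example values:

```python
import statistics
from collections import deque

def make_anomaly_detector(window=60, sigmas=3.0):
    """Flag a reading as anomalous if it falls more than `sigmas` standard
    deviations from the rolling mean of the last `window` readings."""
    history = deque(maxlen=window)

    def check(value):
        anomalous = False
        if len(history) >= 10:  # wait for enough history to be meaningful
            mean = statistics.mean(history)
            stdev = statistics.stdev(history)
            anomalous = stdev > 0 and abs(value - mean) > sigmas * stdev
        history.append(value)
        return anomalous

    return check

# Example: feed humidity readings as they arrive from the database or device.
detect = make_anomaly_detector()
for reading in [55.0, 54.8, 55.2, 54.9, 55.1, 55.0, 54.7, 55.3, 54.9, 55.0, 20.0]:
    if detect(reading):
        print(f"Anomalous humidity reading: {reading}%")
```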
DYLAN SWARTZ: 55:16
We have a couple of folks who use Golioth and have some edge AI type applications. We have a partner called Edge Impulse. They have a lot of guides on their website, if you want to check them out, around how to take time series data and use that as training data to train an AI model specifically for embedded IoT use cases. So, they'll run sort of all the training for that model and then compile it down into a super tiny C library with TensorFlow Lite that can run on a microcontroller. So, you can sort of create this pipeline of data flowing into Influx, use that to then train a model, and then send that model back down to the device to continually improve that device and let it make decisions even locally.
SUYASH JOSHI: 56:19
That's a great tip. So yeah, Edge Impulse is another company in that space. They've done some great work with ML on tiny Arduino boards. Any other questions or comments? And yeah, if you didn't take the screenshot, we will send the recording to you in 24 hours with the slides. So yeah, here it is again. And you can access these resources. When you get the recording and the slides, you will also get our email. Feel free to contact us with any further questions. You can also reply to that email. And yeah, suggest any other topics. I think preventative maintenance is very big in the industrial IoT world. And I think anomaly detection using machine learning could be one specific area that we could do a webinar on in the future, or maybe a hands-on training if we have enough interest in that. So, thanks for joining, everyone. Any last-minute questions before we close? Have a great day wherever you are. And thanks, Dylan, again, for this incredible webinar.
DYLAN SWARTZ: 57:48
Sure.
SUYASH JOSHI: 57:49
Bye.
DYLAN SWARTZ: 57:49
Thank you. Bye, everyone.
Suyash Joshi
Senior Software Engineer and Developer Advocate, InfluxData
Suyash Joshi is a senior software engineer and accomplished developer advocate at InfluxData. His career includes leading pivotal projects at Oracle and RingCentral. Holding both a B.S. in Computer Science and an M.A. in Game Design, he seamlessly blends technical expertise with a flair for creativity. Driven by his passion, he nurtures and hosts developer events in London, delivering sought-after keynote speeches and participating in hackathons worldwide. Apart from his day job, Suyash Joshi is a magician and member of the prestigious Magic Circle in London, crafting performances that ignite wonder, joy, and inspiration. Find him on X: @suyashcjoshi
Dylan Swartz
Product Management Leader, Golioth
Dylan Swartz is a product management leader with a strong foundation as a developer, driven by a passion for creating exceptional developer experiences and building high-performing teams. He specializes in developing products that empower others to innovate and grow their businesses, including platforms, APIs, and tools designed to lower barriers to creativity and progress. Deeply fascinated by the intersection of hardware and software, Dylan is dedicated to exploring innovative ways to simplify hardware through the power of software.