A Home Assistant + MQTT integration for Pulse Hubs

I just got the pulsegrow-pro a couple of days ago, and I'm just starting to play with it. As for experience with Python, I have taught sections of NASA Goddard's Python Bootcamp in the past. That said, 4 or 5 of the past 6 years were spent working on the air-gapped GOES-R ground system, maintaining the L2 science products. So, much of my Python is a little out of date, but I can catch up quickly when sufficiently motivated :wink:

In most of my work I was forbidden to use AI tools, so those are mostly new to me. As we move this along, I might ask for some suggestions on how to get them to work without seriously hallucinating. I poked at them a few times at home to see what they could do, and found they were hallucinating so badly that the results were decent suggestions at best.

I’ll try to set things up to collect the endpoints and other device info. This will take me a little while, as I am working on a few other things (like job applications after basically getting DOGE’d).

This looks fun!


Sorry to hear about your (presumably) STEM work being impacted. A close friend of mine is a director for a national STEM education org, and hers has been similarly impacted in deep ways. Your Python bootcamp work is awesome. I have no idea how to get into that sort of work, but it sounds extremely rewarding and fun. My lack of a 4-year degree can hold me back from the careers typically held by academics. My learning has been non-traditional: teaching myself while being tossed in the deep end at various jobs over the last two decades. I've been looking for a change to something more "meaningful" like that, just very slowly, since I'm relatively happy with my current role and have plenty to do.

Given you sound good on the Python front, I assume the only place you might run into friction would be the Poetry stuff, because Python packaging remains a nightmare, as I'm sure it was when you last interacted with it. Or some of the abstract AppDaemon shenanigans I found confusing while learning. If you run into any annoying blockers there, please don't hesitate to ping me.

As for the AI stuff, I don't use it heavily here, but it can be useful. I use it as a productivity enhancer and to teach myself new topics faster than I could in the days before good LLMs. I tend to feed way more context in my prompts than I see others do, which might be how I get more effective results. I hate not being able to articulate better how I get results, as it makes it feel magical when I'm sure it's not. But I can share some actual prompts I've used before on things I'm working on as an example, via email or something.

> In most of my work I was forbidden to use AI tools, so those are mostly new to me

I did hear that Anthropic released a Claude model that has been approved for work in US classified environments on national security workloads. It sounds like times may be slowly changing there.

In the case of this project, I feed a lot of context to ChatGPT's Codex tool using the AGENTS.md file. It's the first thing Codex will look to for initial context and instructions. And behind the scenes in OpenAI's configuration for Codex, I configure secrets and other stuff in its runtime environment variables. That way, it can do things for me like I asked you for earlier. I can say, "I've armed you with the needed info to connect to the Pulse API; now connect to these endpoints I'm going to paste, using my device info, and make a PR with the captured JSON output from each, in a way that is consistent with the existing mock JSON files." And it was able to do that without any further exposition from me.

Now that I have a decent, working base on this project, Codex can be really helpful for doing major refactoring without breaking the program. I've found more abstract or cute ways of doing things as I reach the edges of this or that engineering approach in AppDaemon's framework, and when I explain to Codex that this is how I do things now and give it a quick sketch of how I want things refactored, it does a pretty good job of filling in the blanks, close to how I would have. Then I can just pull its branch, do some fixups, squash some commits, and I have a solid refactor.

Hopefully that gives some helpful perspective. Feel free to follow up with any questions, of course.


My company has been really pushing the use of AI tools. I've been using them for test case/unit test generation, small bug fixes, and refactoring. They can be a real time saver for small stuff. Recently, for personal use, I've been trying to learn React using Cursor AI. I've managed to build a really fun front end that displays all my data from Pulse devices; I tried to mimic the design of the Apollo command module from NASA. It's powered by a Python server that fetches data every minute, stores it in a database, and provides an API for the front end.
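The fetch-every-minute, store-in-a-database pattern described above could be sketched roughly like this. This is a minimal illustration, not the poster's actual code: the reading fields (`device_id`, `metric`, `value`) and the fetch function are placeholders for whatever the real Pulse API calls return.

```python
import sqlite3
import time

# Hypothetical polling daemon: fetch_fn stands in for whatever function pulls
# readings from the Pulse API and returns a list of
# {"device_id": ..., "metric": ..., "value": ...} dicts. The field names here
# are assumptions for illustration, not the real API schema.

def make_db(path=":memory:"):
    """Create (or open) the readings database."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS readings ("
        " ts REAL, device_id TEXT, metric TEXT, value REAL)"
    )
    return db

def store_readings(db, readings, now=None):
    """Insert one fetched batch of readings, stamped with a single timestamp."""
    ts = now if now is not None else time.time()
    db.executemany(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        [(ts, r["device_id"], r["metric"], r["value"]) for r in readings],
    )
    db.commit()

def poll_forever(db, fetch_fn, interval=60):
    """Fetch once a minute and persist; a web front end can then query the DB."""
    while True:
        store_readings(db, fetch_fn())
        time.sleep(interval)
```

The front end's API server would then read from the same database, which keeps the poller and the UI decoupled.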


I love that! I've been a fan of Apollo-era stuff ever since I saw Apollo 13 as a child. In 2016, I was able to see in person the Saturn V rocket they have on display at Johnson Space Center in Houston, and get in the Apollo 17 command module, among other neat experiences. No doubt @EBo has much cooler anecdotes on the NASA front, though.

I’ve heard Cursor mentioned by some coworkers but haven’t tried it yet. It’s on my list. I only just started making use of Codex and running Ollama and Open WebUI locally.


@terpasaurus.midwest Finally had a chance to look into this thread today.

In regards to the API stuff, I gave it a brief glance and it appears to be fine.
I don’t recall us having any complaints either.

However, I’m more than eager to accommodate any requested changes / double check for any issues.
If you have the time - please dm or email us with more details on what’s giving you trouble and I’ll have it sorted.

Also - if you need any new API endpoints, I can usually add them in pretty quickly.


Thanks for the reply. I haven’t had a chance yet to follow-up to reproduce the issue. I’ll do that this week and get back to you via email.

When it comes to new API endpoints, I have some unsolicited suggestions:

  • A single endpoint to fetch recent data for all devices and hubs + sensors would be really nice, similar to the all-devices call the webapp uses.
  • A single endpoint to fetch all sensors' recent data, for when all-devices isn't needed.
  • A single endpoint to fetch all devices' recent data (Pulse One, Pro, Zero), again for when we don't need everything.
  • A force-read endpoint for sensors would be great, similar to the "Rapid" button that was graciously added.
  • The ability to edit and enable/disable alert thresholds for devices and sensors.
  • An endpoint for creating/editing OpenSprinkler automation programs.

Thank you for the feedback.
I can definitely add in the endpoints for getting data.
I should be able to add that pretty soon.

As for disabling the alert thresholds and editing the sprinkler programs: this is something that I can't promise. Our public API uses API keys, and it's considered bad practice to use them for anything other than fetching data.
Irrigation comes with a lot of responsibility and we’d like to avoid any risks here.

Whoa, wait, where is this "Rapid" button?

It’s under the chart where the share and export buttons are.
Only for hub sensors atm.

Hi there, I added two endpoints to the public API:
force-read for triggering a forced sensor read.
And All-Devices for getting the data for all devices, including their most recent datapoint.
The Hubs, Sensors, and Ones/Zeroes/Pros are all separated in the response, so I didn't see much sense in adding more endpoints. (This won't affect performance, since it's all cached.)
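For anyone scripting against a response shaped like that (hubs, sensors, and Ones/Zeroes/Pros in separate top-level lists), the payload could be flattened for display with something like the sketch below. The key names (`hubs`, `sensors`, `devices`, `name`, `latestDataPoint`) are guesses at the payload shape for illustration, not the documented schema.

```python
# Flatten an All-Devices-style response into (kind, name, latest-datapoint) rows.
# The key names below are assumptions about the payload, not Pulse's documented
# schema; adapt them to whatever the real response contains.

def flatten_all_devices(payload):
    rows = []
    for kind in ("hubs", "sensors", "devices"):
        for item in payload.get(kind, []):
            rows.append((kind, item.get("name"), item.get("latestDataPoint")))
    return rows
```

Since the endpoint is cached server-side, a client like this can poll it fairly freely without hammering the backend.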


Hey everyone,

first of all, thank you @monstermash for providing this integration. I successfully installed your AD app. It took a bit of fiddling around, as I am not familiar with AppDaemon, but in the end I have it running and get data from my Pulse hub sensors.

I am in the middle of creating my dashboard in Lovelace. I would love to display a statistics-graph card with chosen sensor data, but here comes the stumbling block.

I cannot choose any sensor added via your AD app. Somehow HA tells me "No statistics found". The funny part is that when I inspect the sensor properties over MQTT, I can see the whole timeline.

Maybe you have a hint for me?


I’m not even sure I’m the first one to request this, but either way all the credit goes to @Ggofman and the rest of the team for making it happen.

@blackhatgarden I made the AppDaemon integration. I think I know why you're having the issue. I'll look into it today and follow up.

I made this when I knew less about Home Assistant, and I think I didn't set the measurement type or some other silly attribute that the statistics feature expects on sensors wanting to opt in to statistics.


Ohh I mixed the names up :sweat_smile:

shout out to @terpasaurus.midwest for this work :wink:

Just in case someone else is stumbling upon this: if you want to display some metrics with Home Assistant's "Statistics graph card" in Lovelace, you have to customize the sensors with the attribute
state_class: measurement.

Otherwise there will be no statistics.

The easiest way to achieve this is by editing Home Assistant's configuration.yaml, as described in this thread:

Sensors not showing in statistics diagram card - #8 by tom_l - Configuration - Home Assistant Community
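For reference, the customize approach from that thread looks roughly like this in configuration.yaml. The entity ID below is a placeholder; substitute the actual names of your MQTT sensors:

```yaml
# configuration.yaml — opt MQTT sensors in to long-term statistics.
# sensor.pulse_hub_temperature is a placeholder entity ID for illustration.
homeassistant:
  customize:
    sensor.pulse_hub_temperature:
      state_class: measurement
```

After reloading (or restarting Home Assistant), the sensor should start accumulating statistics and become selectable in the Statistics graph card.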


lol yes exactly, the "measurement" state_class you mentioned is the vague memory I had from months ago when I did this :rofl:

I should be able to make the integration set this for you. This was something I did locally, but since no one ever asked about it later, I never bothered to do anything about it.
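If the integration publishes Home Assistant MQTT discovery configs, the fix would amount to including `state_class` in the discovery payload. This is only a sketch of that idea, not the integration's actual code: the topic layout, sensor names, and helper function here are made up for illustration.

```python
import json

# Sketch of building an MQTT discovery config that opts a sensor in to
# statistics. The names and topic below are illustrative assumptions; the
# actual integration's topics and IDs may differ.

def discovery_payload(sensor_id, name, unit, state_topic):
    return {
        "name": name,
        "unique_id": sensor_id,
        "state_topic": state_topic,
        "unit_of_measurement": unit,
        # The attribute Home Assistant's statistics feature looks for:
        "state_class": "measurement",
        "device_class": "temperature",
    }

payload = discovery_payload(
    "pulse_hub1_temp", "Pulse Hub 1 Temperature", "°C", "pulse/hub1/temperature"
)
config_json = json.dumps(payload)
# This JSON would be published (e.g. via AppDaemon's MQTT plugin) to a
# discovery topic such as: homeassistant/sensor/pulse_hub1_temp/config
```

With `state_class` set at discovery time, users wouldn't need the configuration.yaml customize workaround at all.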


Would definitely make things more beginner friendly. But do as you please, for me it works well with editing the configuration.yaml :grin:

My bad if I misrepresented anything.

No worries, I appreciate knowing that someone else is getting use out of this integration. I made it for myself back in July or August and it has been silently working without issues while I’ve been incredibly busy on my actual day job and neglecting my grow and esp32 work.

I planned to come back and make this integration installable via HACS but never took the time. I’ll try to circle back and give this project some love again soon and maybe do that.

@monstermash no worries I didn’t assume it was intentional. I just didn’t want it to seem like it was officially endorsed, supported or created by the Pulse folks since they won’t want to deal with support emails asking about something I broke with it :sweat_smile:

Also @blackhatgarden if you can clarify any friction points you had in getting things setup the first time, I can update the readme if needed to make it easier for others in the future. Lmk. But if I make it installable via HACS I assume that would be simpler for everyone and make it easier to update/distribute updates.


Sure, if I find time I will write down some points. If I remember correctly, it was only related to AppDaemon, as I had never used it before: finding the right folders and messing with the configuration to load the missing MQTT plugin for AppDaemon.
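For anyone else who hits that step, enabling the MQTT plugin is done in appdaemon.yaml, roughly like this. The broker host and credentials below are placeholders; check the AppDaemon docs for the options your version supports:

```yaml
# appdaemon.yaml — enable the MQTT plugin alongside the usual HASS plugin.
# Broker host and credentials are placeholders for illustration.
appdaemon:
  plugins:
    HASS:
      type: hass
    MQTT:
      type: mqtt
      namespace: mqtt
      client_host: 192.168.1.10
      client_port: 1883
      client_user: mqtt_user
      client_password: changeme
```

Apps that talk to the broker then address it through the `mqtt` namespace.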

I can imagine that a HACS-installable version would clearly help, as there are presumably people who are not familiar with YAML, the shell, or reading logfiles. :wink: