Ubiquiti SuperLink and USL Environmental Sensors

First off, let's start with a warning: if you order your SuperLink and sensors from the Ubiquiti US store, it's possible you'll receive a mix of regions. In our case we got a US SuperLink and all EU-model sensors, and they don't work together. You can identify an EU sensor or device by a small sticker on the box, like this:

But once you get the right sensors (kudos to UBNT for overnighting us replacements), setup is quick: just pull the battery tab and they come up. So far, so good. They've been reliable connection-wise for us; we've been testing them through concrete-filled, steel-reinforced CMU and still have a solid connection with reliable monitoring:

Overview: What is SuperLink?

The SuperLink platform from Ubiquiti is a new wireless sensor protocol & gateway ecosystem designed to integrate with the UniFi OS / UniFi Protect environment and deliver IoT sensor connectivity with enterprise-grade range, latency, and battery longevity.

Key technical highlights

  • SuperLink is designed for multi-kilometer line-of-sight range, enabling large-scale deployments (industrial, commercial, smart buildings) rather than just short-range BLE sensors.
  • Ultra-low latency communications, tailored for security / alarm / automation sensor use-cases.
  • Efficient power management: supports long battery life endpoints (key for sensors in remote/undisturbed locations).
  • Integrated into UniFi OS: the gateway is adopted into UniFi Protect, which means your existing UniFi-based infrastructure (if you have one) can leverage the sensors and gateway.
  • For deployments: the gateway supports dual radios – Bluetooth (for legacy BLE sensors) plus the proprietary sub-GHz SuperLink radio for the new sensors.
  • In short: if you are managing facilities (data center racks, network closets, remote MSP sites) and you need environment-monitoring (temperature, humidity, water leak, light) with wide coverage and minimal wiring, the SuperLink + USL-Environmental combo offers an interesting path.

Tucson, Data Centers and the AI “Bubble”

I’m the Vice President of Operations for a data center provider in Tucson (it’s easy to find out which one; I just keep the two purposefully separate), and there’s been a lot of talk lately within the community about data centers and the resource consumption they demand, specifically in Tucson around Project Blue.

One of the big demands of some facilities is water. If they’re going to be running GPU workloads, or just intense workloads generally, and turn to chilled water (they should; it’s very efficient at scale), the plant design matters. The initial proposal here was for an open-loop, water-cooled chiller plant. Given the low humidity in Arizona and the low up-front cost to implement such a system, open loop makes sense on paper, but there’s one issue: water consumption. We are in the desert, and if you had done any research in the city, you’d know that high water usage is not going to get a warm reception. They did not do that research, and so the community rejected the proposal.

Closed-loop cooling is more expensive up front and not quite as efficient, but closed systems last longer and only require small amounts of makeup water after the initial fill.

Finally, you can do chilled water or glycol with a refrigerant-based system; again, I’m sticking with fluid-type systems here, not direct expansion (DX) systems.
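To put the open-loop water-consumption point in rough numbers, here is a back-of-envelope sketch of cooling-tower makeup water for a given heat-rejection load. The latent heat figure and the cycles-of-concentration default are my assumptions for illustration, not numbers from any specific plant or from Project Blue:

```python
# Rough makeup-water estimate for an open-loop (evaporative) chiller plant.
# Assumptions (mine, not from the article): ~2,400 kJ removed per kg of water
# evaporated at cooling-tower conditions, and a "cycles of concentration"
# (COC) figure that sets how much blowdown is dumped to control mineral buildup.

LATENT_HEAT_KJ_PER_KG = 2400.0  # approximate heat rejected per kg evaporated

def makeup_water_lph(heat_reject_kw: float, cycles_of_concentration: float = 4.0) -> float:
    """Litres per hour of makeup water for a given heat-rejection load in kW.

    Evaporation carries away the heat; blowdown multiplies the total by
    COC / (COC - 1). 1 kg of water is treated as 1 litre.
    """
    evap_kg_per_hr = heat_reject_kw * 3600.0 / LATENT_HEAT_KJ_PER_KG
    return evap_kg_per_hr * cycles_of_concentration / (cycles_of_concentration - 1.0)

# Toy input: a 250 kW critical load plus ~30% mechanical overhead.
print(round(makeup_water_lph(250 * 1.3)))  # 650 L/hr
```

Even at this small scale the water adds up to roughly 650 litres an hour, around the clock; scale that toward hundreds of megawatts and the community's objection is easy to understand.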

We use direct expansion right now because our critical load is only 250kW at the moment, and that’s not enough to justify building out a chiller plant from a cost perspective. We will install a glycol loop for GPU workloads and direct-to-chip cooling before converting our DX units.

All this to say: being mindful of your surroundings and environment makes a big difference in the headwinds a given project faces. Perhaps if the team behind it had more than six months of experience working together, and had spent some time in the community before the initial design, they would have known this. Furthermore, the proposed power load, some 650MW, is more than the city currently consumes, and that existing consumption is already hard for the local utility to serve reliably. We have hundreds of power monitors across southern AZ, and the picture isn’t great; we ourselves routinely have to go to generator during unplanned utility failures every year, in the city. I don’t think it’s rational to expect the local utility to double its power supply in two years without community impact.

The AI boom is not a bubble, I don’t think, but it will be interesting to see how much capital comes calling back and how soon. There’s a lot hanging out there right now, and I’m not sure what these projects are reasonably going to do for power. Microsoft bought a nuke; others are using gas turbines on site through various loopholes. It’s interesting. We’ve been looking at the same requirements ourselves, and we are going to install about 300kW of solar in the city to offset some load. We have about 250kW of critical power shell available right now, and hopefully in the near future we’ll have about 10MW available, but, ideally, in a sustainable manner.
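For a sense of what that 300kW of solar buys back, here is a quick annual-energy estimate. The capacity factor is my assumption (a ballpark for fixed-tilt PV in southern Arizona), not a figure from our actual design:

```python
# Back-of-envelope annual energy from a PV array, given a nameplate rating.
# Assumption (mine, not from the article): ~24% capacity factor, a reasonable
# ballpark for fixed-tilt solar in southern Arizona.

def annual_solar_mwh(nameplate_kw: float, capacity_factor: float = 0.24) -> float:
    """Expected annual output in MWh for a PV array of the given nameplate kW."""
    hours_per_year = 8760
    return nameplate_kw * capacity_factor * hours_per_year / 1000.0

print(round(annual_solar_mwh(300)))  # ~631 MWh/year of offset
```

Against a 250kW critical load running continuously (about 2,190 MWh/year), that covers a meaningful slice of the load, though nowhere near all of it.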