Raspberry Pi Sizes Up HAT+ Spec For Future Hardware Add-ons

The Raspberry Pi project has released the first revision to its Hardware Attached on Top (HAT) spec, along with a specification for the RPi 5's PCIe connector.

Revealed in a Friday post penned by chief operating officer and hardware lead James Adams, the HAT+ spec [PDF] has four main changes:

  • HAT+ boards must be electrically compatible with the STANDBY power state – where the 5V power rail is powered, but the 3.3V rail is unpowered. The Raspberry Pi 4 and 5 support the STANDBY state, but older Pis don't;
  • The specification is less prescriptive about HAT physical dimensions;
  • The HAT EEPROM content is much simpler – see the sketch after this list;
  • A special class of stackable HAT+ boards is supported, allowing one extra HAT+ on top for a maximum stack of two.
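
On current Raspberry Pi OS images, the firmware reads an attached HAT's EEPROM at boot and surfaces its identification strings in the device tree, which is where that simpler content ultimately lands for software to consume. The sketch below is a minimal illustration of reading those fields; the /proc/device-tree/hat/ path and field names reflect existing Raspberry Pi OS behaviour, not anything defined by the draft HAT+ spec, whose EEPROM format the utilities don't yet generate.

    # Minimal sketch: read the HAT identification fields that Raspberry Pi OS
    # exposes once the firmware has parsed an attached board's EEPROM.
    # Assumption: the /proc/device-tree/hat/ node and these field names are
    # current Raspberry Pi OS behaviour, not part of the draft HAT+ format.
    from pathlib import Path

    HAT_NODE = Path("/proc/device-tree/hat")

    def read_hat_info():
        """Return the detected HAT's EEPROM fields, or an empty dict if none."""
        if not HAT_NODE.is_dir():
            return {}
        info = {}
        for field in ("vendor", "product", "product_id", "product_ver", "uuid"):
            path = HAT_NODE / field
            if path.exists():
                # Device-tree string properties are NUL-terminated.
                info[field] = path.read_bytes().rstrip(b"\x00").decode()
        return info

    if __name__ == "__main__":
        hat = read_hat_info()
        if hat:
            for key, value in hat.items():
                print(f"{key}: {value}")
        else:
            print("No HAT EEPROM detected")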

"The original HAT specification was written back in 2014, so it is now very overdue for an update," Adams wrote. "The new specification simplifies certain things, including the required EEPROM contents, and pulls everything into one document in the new Raspberry Pi documentation style, along with adding a few new features."

Adams pointed out that the HAT+ spec is a work in progress, as "our EEPROM utilities haven't yet been updated to support the generation of the new style of EEPROMs."

He therefore characterized the release as "very much for people that want to get a feel for how the HAT standard is changing."

Adams's post also addresses the omission of an M.2 connector from the RPi 5 – a decision made, he wrote, on the grounds that the hardware "is large, relatively expensive, and would require us to provide a 3.3V, 3A power supply. Together these preclude us offering it in the standard Raspberry Pi form factor."

But the Pi guys have found a way to get an M.2 device hooked up to their computers using the PCIe connector, as detailed in a spec [PDF] for the 16-way PCIe connector.

RPi HAT+ M.2 module

Delivering power to a HAT over PCIe means piping 3.3V through a single pin of the connector, and that pin will power a forthcoming M.2 HAT+ that Adams wrote "is in the final stage of prototyping, and will be launched early next year."
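
Once the M.2 HAT+ – or anything else plugged into the Pi 5's PCIe connector – is attached, the kernel enumerates it like any other PCIe endpoint, so checking that a device actually showed up needs nothing Pi-specific. Here is a minimal sketch using standard Linux sysfs paths; the NVMe class-code check is generic PCI behaviour, not anything drawn from the HAT+ or connector specs.

    # Minimal sketch: list enumerated PCIe devices via Linux sysfs, e.g. to
    # confirm an NVMe drive on an M.2 carrier has been detected.
    from pathlib import Path

    PCI_DEVICES = Path("/sys/bus/pci/devices")

    def list_pci_devices():
        """Return one dict per enumerated PCI(e) function."""
        devices = []
        for dev in sorted(PCI_DEVICES.iterdir()):
            entry = {"address": dev.name}
            for attr in ("vendor", "device", "class"):
                entry[attr] = (dev / attr).read_text().strip()
            devices.append(entry)
        return devices

    if __name__ == "__main__":
        for dev in list_pci_devices():
            # PCI class 0x0108xx is NVM Express mass storage.
            tag = " (NVMe)" if dev["class"].startswith("0x0108") else ""
            print(f"{dev['address']}  vendor={dev['vendor']}  device={dev['device']}{tag}")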

Adams also pointed out that "HAT+" is written on the RPi 5 board, which he reckons should have given the game away a little earlier than his post.

Why entangle HAT+ and PCIe?

"We really wanted to get the HAT+ standard right, as it's likely to be around for as long as the old HAT standard," Adams explained. "One of the reasons for the delay in getting the PCIe connector standard published was our sense that PCIe boards (PIPs!) that go on top, rather than boards that go beneath, should probably be HAT+ boards. Ours is going to be!"

"Watch this space for the new M.2 HAT+, and a final version of the HAT+ standard, which we'll release alongside it in the new year," his post concludes. ®
