My office as entropy

Turning my office's environmental data into verifiable randomness.

I was in the shower around 5am on Labor Day, watching the soapy water swirl and run down the drain. It reminded me of sea foam, and my thoughts drifted from waves to Nazaré before settling on Chaos in Lisbon.

50 wave machines running in unison to produce entropy, used to secure the web.

Form and function.

On January 2nd, 2025 I started tracking the conditions of my home office using a Raspberry Pi Zero 2 W and a Waveshare Environment Sensor HAT (ESH).

| Sensor | Measurement | Range / Specs |
| --- | --- | --- |
| TSL25911 | Ambient Light | 0 ~ 88,000 Lux |
| BME280 | Temp/Humidity/Pressure | -40 ~ 85 °C (±1 °C); 0 ~ 100 %RH (±3 %RH); 300 ~ 1100 hPa (±1 hPa) |
| ICM20948 | Motion (9-DOF) | Accel: ±2/4/8/16 g; Gyro: ±250/500/1000/2000 °/s; Mag: ±4900 µT |
| LTR390-UV-1 | UV | 280 ~ 430 nm wavelength |
| SGP40 | VOC | 0 ~ 1,000 ppm ethanol eq.; <10 s response; <60 s startup; on-chip humidity comp. |

I’ve been using the values for an ongoing visualization project, which can be viewed here: https://office.pure---internet.com/

I knew right away that this was an opportunity to explore ZK a bit more. The documentation for tooling like Circom and Noir is improving and LLMs help get you up to speed. Armed with a shower thought and a bit of Claude Code and Cursor, I got to work.

I ended up choosing Circom. Noir’s backend-agnostic approach was appealing, and I found the docs slightly more approachable, but the Barretenberg prover has issues on ARM systems.

329k rows → single seed

The master secret is the foundation of this system. I processed 329,000+ sensor readings collected since January 2nd, each containing measurements from 6 different environmental sensors (ambient light, temperature, humidity, pressure, UV, and VOC).

  1. Chunking: Divide the historical data into windows of 100 readings each

  2. Fingerprinting: For each window, extract key statistical features - mean, variance, min/max values across all sensors

  3. Hashing: Use Poseidon hash (ZK-friendly) to create a deterministic fingerprint for each window

  4. Aggregation: Combine all window fingerprints through iterative hashing to produce a single 256-bit master secret
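The four steps above can be sketched in Python. This is a minimal illustration, not the real implementation: SHA-256 stands in for the Poseidon hash used in the circuit, and the exact feature encoding is an assumption.

```python
import hashlib
import statistics

WINDOW = 100  # readings per window, as described above


def fingerprint(window):
    """Hash per-window stats (mean, variance, min, max per sensor) into a digest.
    SHA-256 stands in here for the ZK-friendly Poseidon hash."""
    feats = []
    for sensor in range(len(window[0])):  # one column per sensor
        col = [row[sensor] for row in window]
        feats += [statistics.mean(col), statistics.pvariance(col), min(col), max(col)]
    blob = ",".join(f"{v:.6f}" for v in feats).encode()
    return hashlib.sha256(blob).digest()


def master_secret(readings):
    """Fold all window fingerprints into a single 256-bit secret via iterative hashing."""
    acc = b"\x00" * 32
    for i in range(0, len(readings) - WINDOW + 1, WINDOW):
        acc = hashlib.sha256(acc + fingerprint(readings[i:i + WINDOW])).digest()
    return acc
```

Because every window digest is chained into the next hash, reproducing the secret requires the exact same readings in the exact same order, while the output itself reveals none of them.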

The master secret is unreproducible (you’d need the exact same sensor readings in the exact same order), deterministic (same data always produces the same secret), and privacy-preserving (reveals nothing about the actual sensor values).

The system also looks at frequency patterns in the motion data. Vibrations from trucks, footsteps, or the washing machine create unique signatures that are nearly impossible to replicate.

Generation combines the master secret (hidden), a user-provided seed (public), and a timestamp to ensure unique output each time.
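The combination step amounts to hashing the three values together. As a hedged sketch (SHA-256 again standing in for Poseidon, and the byte layout being my assumption):

```python
import hashlib


def generate_randomness(master_secret: bytes, seed: str, timestamp: int) -> bytes:
    """Bind the hidden master secret to a public seed and timestamp.
    Any change to the timestamp (or the office data behind the secret)
    yields an entirely different output."""
    payload = master_secret + seed.encode() + timestamp.to_bytes(8, "big")
    return hashlib.sha256(payload).digest()
```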

The result is a ZK proof that says “this randomness came from real sensor data and was processed correctly” without revealing the details of my office. Even if you use the same seed multiple times, you’ll get different randomness because my office environment is constantly changing.

Building the proof

The proof makes four promises: I know the master secret without revealing it, I used real sensor data without exposing the readings, I processed everything through the correct algorithm, and nothing got tampered with. A digital wax seal.

The secret stuff — sensor readings, master secret, intermediate calculations — stays locked away. The public stuff — seed, timestamp, final random number, sensor data fingerprint — is in the open for anyone to verify.

Keeping receipts

These proofs are small, about 2 KB each. I use IPFS through Pinata to store them, and when a proof is generated and uploaded, the CID and random number are stored onchain.

The whole thing lives at 0xCf5Ea3Acb389b8a89935BD542273290F05f3054D on Base Sepolia testnet.

What’s next

Everything works. You can go to the site and generate verifiable randomness powered by whatever’s happening in my office at that moment. The full pipeline from sensors to ZK proof to blockchain is running and has processed hundreds of proofs.

I keep thinking about a network of sensor operators contributing entropy to a shared pool. Cross-chain bridges for randomness on other blockchains. Batching proofs together, recursive proofs to compress things further. Maybe someone defines standards for environmental randomness beacons.

Solutions like LavaRand and Chainlink VRF already provide APIs at scale. Dust particles, air currents, and temperature fluctuations as a source of randomness feels different though. Entropy is everywhere.

Whatever someone comes up with next will be random in its own right.