Tjoskar's Blog
I write about stuff, mostly for myself
Use an E-Ink display to show information about your house
I'm documenting how I set this up mostly for myself so I can look back at what I did, but hopefully you can take some inspiration for your own project. And I know, I could cut the passepartout better. More on that below.
What I've used for this project:
- A Pi Zero W 2
- Waveshare 7.5″ e-Paper HAT
- An IKEA frame
- Two physical buttons
- One LED
- Some solder and a few wires
I got this idea pretty soon after we moved into our house and I set up Home Assistant. I loved the data HA gives you, but I disliked needing my phone every time I wanted to glance at energy use or toggle something. I wanted ambient, glanceable info in the kitchen. I also like the feeling of a physical button instead of tapping on a screen. I didn't want a bright LCD, but rather E-Ink. At first I was thinking about a color E-Ink panel, but I couldn't find any with a reasonably fast refresh (everything I looked at took ~40 s to redraw the screen).
When I started this project I first googled around and found many home dashboards that used an E-Ink display, but all of them rendered a web page: spin up a browser (often headless Chrome), take a screenshot, then pipe that bitmap to the E-Ink controller. Clever, but heavy. It looks like this:
- Event: a light turns on
- Headless Chrome process spins up
- Chrome loads a page
- Screenshot captured
- Script forwards image to controller
The latency can be up to a minute due to startup overhead, and the error handling becomes complex.
I opted to render directly with Pillow (PIL) on the Pi Zero W 2. It takes ~100 ms to compose the image and then the display's own refresh time dominates (~1 s). Simpler, faster, fewer components to babysit.
I briefly considered a Pi Pico, but 264 KB RAM felt tight for image buffers and font rendering. So: bigger sibling it is.
Pi Zero setup
Using the Raspberry Pi Imager app I flashed Raspberry Pi OS (32-bit) and entered Wi-Fi credentials so I could SSH in immediately after first boot.
Important note! I first took the "recommended" image and only later realized it included a full GUI desktop. Also: 32-bit is recommended over 64-bit (see https://github.com/raspberrypi/linux/issues/4876 or various forum threads). I spent several hours chasing unstable SSH before plugging in a monitor and discovering a window manager happily eating resources in the background 🤦
After installation you should be able to SSH into it (find the IP via your router UI or nmap).
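If the router UI doesn't show it, a ping scan of the local subnet usually finds the Pi (assuming a 192.168.1.0/24 network here; adjust the range to match yours):
nmap -sn 192.168.1.0/24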
Run sudo apt update && sudo apt upgrade -y to pull the latest packages.
Hardware setup
I'm working with the Waveshare 7.5″ e-Paper HAT, an 800×480 E-Ink display that comes with a HAT for the Raspberry Pi.
Display Config Switch: Set to A (3Ω)
The HAT includes a small DIP switch labeled "Display config", which toggles between 3Ω (A) and 0.47Ω (B) resistance settings. This setting adjusts the driving voltage for the E-Ink panel, and must match the specific model of display you're using.
In my case, the back of the display is labeled V2, which according to Waveshare's official documentation means:
Set "Display config" to A (3Ω)
Setting it incorrectly may result in poor image quality or a non-working display, but it's unlikely to permanently damage anything.
Interface Config Switch: Set to 0 (4-line SPI)
Another switch is labeled "Interface config", which selects between:
0 = 4-line SPI (uses a separate DC line)
1 = 3-line SPI (shares DC/data on the same line)
For Raspberry Pi setups, 4-line SPI is the standard and recommended setting:
Set "Interface config" to 0
This configuration works out of the box with Waveshare's official Raspberry Pi libraries. If you accidentally set it to 1 (3-line SPI), the display likely won't function at all unless your software explicitly supports that mode.
Connecting the FPC Cable Correctly
The display connects to the HAT using a flat flexible cable (FPC/FFC). FPC stands for Flexible Printed Circuit, and it consists of fine copper traces laminated in plastic, commonly used for displays and cameras.
Insert the cable with the exposed copper traces facing down, toward the PCB.
I first inserted it upside-down and of course nothing showed up, and I got some weird error messages that were hard to trace back.
Connect the Pi to the display
| e-Paper | BCM | Pi Zero Pin |
|---|---|---|
| VCC | VCC | 1 |
| GND | GND | 6 |
| DIN | 10 (MOSI) | 19 |
| CLK | 11 (SCLK) | 23 |
| CS | 8 (CE0) | 24 |
| DC | 25 | 22 |
| RST | 17 | 11 |
| BUSY | 24 | 18 |
| PWR | 18 | 12 |
Software update
Open a terminal on the Raspberry Pi over SSH and run the following command to open the config interface:
# Choose Interfacing Options -> SPI -> Yes to enable the SPI interface
sudo raspi-config
Then reboot your Raspberry Pi:
sudo reboot
Check that dtparam=spi=on was written to /boot/firmware/config.txt
cat /boot/firmware/config.txt | grep dtparam=spi=on
To make sure nothing else is using the SPI bus, check which SPI devices are exposed with ls /dev/spi*. If the terminal outputs /dev/spidev0.0 and /dev/spidev0.1, SPI is enabled and not occupied by another driver.
I have no idea what to do if SPI is occupied.
Create a hello world project
First, install some system packages and clone waveshare/e-Paper to get drivers for the display.
sudo apt update
sudo apt install git python3-pip python3-pil python3-numpy python3-spidev
mkdir ~/hello-world
cd ~/hello-world
git clone https://github.com/waveshare/e-Paper.git # this can take some time
mv e-Paper/RaspberryPi_JetsonNano/python/lib ./
touch lib/__init__.py
touch lib/waveshare_epd/__init__.py
Then create a file called hello-world.py with the following content (for example with vim: vi hello-world.py).
The file contains a few debug logs to see how far execution gets.
from lib.waveshare_epd.epd7in5_V2 import EPD
from PIL import Image, ImageDraw, ImageFont

print(1)
epd = EPD()  # driver class for the 7.5" V2 panel
print(2)
epd.init()  # wake up and initialize the display
print(3)
epd.Clear()  # flush the panel to white
print(4)
image = Image.new('1', (epd.width, epd.height), 255)  # 1-bit image, white background
draw = ImageDraw.Draw(image)
font = ImageFont.load_default()
draw.text((100, 100), "Hello World", font=font, fill=0)  # black text
print(5)
epd.display(epd.getbuffer(image))  # push the frame buffer to the panel
print(6)
epd.sleep()  # put the panel into deep sleep between updates
print(7)
Now, try it out:
python3 hello-world.py
Set up the frame
I used a picture frame from IKEA and cut the mat a bit so it would fit. Unfortunately I was a bit eager and cut it freehand so it ended up a little crooked. I'm planning to 3D print a custom passepartout on this model.
Then I drilled three holes: two for the buttons and one for an LED. I soldered everything together and then connected it all.
| BCM | Pi Zero Pin | Comment |
|---|---|---|
| 21 | 40 | Button 1 |
| 20 | 38 | Button 2 |
| 2 | 3 | LED |
Set up the application
https://github.com/tjoskar/eink-control-panel
The Python code for this project lives here. It's very specific to my use cases (tightly coupled to my electricity provider). You're welcome to open PRs, but forking and tailoring might be faster. If you do something cool, let me know!
Much of the code was AI-generated. I set up the foundation, refined the edges, and let the middle be delightfully machine-written. Some parts are a bit messy but it works.
I use an MQTT client to receive and send events to Home Assistant. When a device I care about (engine heater, bike charger, washing machine, dryer) turns on or off, an event is published that the program listens for. On receipt I do a "fast render" (~1 s refresh including display time). I tried partial updates targeting ~0.4 s but the panel got smeary/ghosty. Probably me missing something, and ~1 s is fine for me.
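For reference, here's a minimal sketch of that listening loop using paho-mqtt. It is not lifted from my repo: the broker address, topic names and the render_dashboard helper are made up, and it assumes paho-mqtt 2.x.
import paho.mqtt.client as mqtt

MQTT_HOST = "homeassistant.local"  # assumption: the HA broker on the LAN
TOPICS = ["home/engine_heater/state", "home/washing_machine/state"]  # made-up topics

def on_connect(client, userdata, flags, reason_code, properties):
    for topic in TOPICS:
        client.subscribe(topic)

def on_message(client, userdata, msg):
    print(f"{msg.topic} -> {msg.payload.decode()}")
    render_dashboard()  # compose the image with Pillow and push it to the panel (~1 s)

def render_dashboard():
    pass  # same Pillow + epd.display(...) flow as in hello-world.py

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0 callback API
client.on_connect = on_connect
client.on_message = on_message
client.connect(MQTT_HOST, 1883)
client.loop_forever()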
One button sends an MQTT event to HA to start the engine heater. That triggers a small dialog ("Starting the engine heater") which auto-dismisses after 5 s. When the heater is running (regardless of how it was started) the LED lights up. From across the room you can instantly tell. In summer I may repurpose it to show whether the bike charger is active, or invent a new excuse for a tiny glowing light.
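A corresponding sketch for the button and LED side, using gpiozero with the BCM numbers from the wiring table above and the same made-up topics; treat it as a starting point rather than the actual code.
import paho.mqtt.client as mqtt
from gpiozero import Button, LED
from signal import pause

button = Button(21)  # BCM 21, physical pin 40
led = LED(2)         # BCM 2, physical pin 3

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0
client.connect("homeassistant.local", 1883)
client.loop_start()

def on_heater_state(client, userdata, msg):
    # Payload format depends on how HA publishes the state; "on"/"off" assumed here
    led.on() if msg.payload == b"on" else led.off()

client.message_callback_add("home/engine_heater/state", on_heater_state)
client.subscribe("home/engine_heater/state")

# A button press asks HA (via MQTT) to start the engine heater
button.when_pressed = lambda: client.publish("home/engine_heater/set", "on")

pause()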
I fetch price and consumption data directly from my electricity provider (I love that my provider has a public GraphQL API for this).
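The call itself is just an HTTP POST with a GraphQL query, roughly like this; the endpoint, query and field names below are invented stand-ins for whatever schema your provider exposes.
import requests

query = """
query {
  todayPrices {
    startsAt
    total
  }
}
"""

resp = requests.post(
    "https://api.example-provider.se/graphql",  # hypothetical endpoint
    json={"query": query},
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
resp.raise_for_status()
prices = resp.json()["data"]["todayPrices"]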
Weather data comes from https://api.openweathermap.org
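Fetching it looks roughly like this, using the current-weather endpoint with placeholder coordinates and API key (check OpenWeatherMap's docs for the endpoints available on your plan):
import requests

params = {
    "lat": 59.33,  # placeholder coordinates
    "lon": 18.07,
    "appid": "<api-key>",
    "units": "metric",
}
resp = requests.get("https://api.openweathermap.org/data/2.5/weather", params=params, timeout=10)
resp.raise_for_status()
weather = resp.json()
print(weather["main"]["temp"], weather["weather"][0]["description"])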
I also show upcoming garbage collection dates (hardcoded dates).
We plan meals in the iPhone Notes app. I eventually built a Shortcut that POSTs the list to a tiny Deno Deploy service backed by KV storage. The script is intentionally small: parse, clean, store. First iteration below (since then I've polished it a tad):
const kv = await Deno.openKv();

Deno.serve(async (req) => {
  // GET: return the stored menu as JSON
  if (req.method === "GET") {
    const entry = await kv.get(["menu"]);
    return new Response(JSON.stringify(entry.value || []), {
      headers: {
        "Content-Type": "application/json",
      },
    });
  }

  // POST (from the Shortcut): strip the note title, keep unchecked items,
  // clean up the checkbox markup and store the first 10 lines
  const body = await req.json();
  const content = (body.content as string).replace("Matsedel\n\n", "");
  const firstSection = content.split("\n\n")[0];
  const lines = firstSection.split("\n");
  const list = lines
    .values()
    .filter((line) => !line.includes("- [x] "))
    .map((line) => line.replaceAll("- [ ] ", "").trim())
    .take(10)
    .toArray();
  const result = await kv.set(["menu"], list);
  return new Response(result.ok ? "OK" : "Error");
});
What I did spend time on was making it possible to extract the data. The only workable way I found was to create a Shortcut that I run manually. It took a lot of trial and error to get it working, and I don't have an easy way to share it without sharing my credentials, so ping me if you get stuck and I might be able to help.
Alternatives (a different notes app with an API) would mean convincing my partner to switch, and that's a harder engineering problem. I'll probably try that at some point.
If you have any questions or comments, feel free to reach out: hello@tjoskar.dev