I have a spare laptop. I’m not going to spend ages cutting it up to make an actual photo-frame, and neither will I build one from scratch with a Raspberry Pi Zero.
I’m just going to put it by the TV and see whether my wife and I look at it.
Requirements:
- Show pictures after power on
- Get pictures from the NAS
- Auto-update them
This is the second post in my series about this project.
Hardware
I replaced a laptop last year, so I’ll use the old Celeron-based Acer Aspire. It has a decent 16" 16:9 screen, although its 1366x768 resolution is actually lower than some of the off-the-shelf photo-frames I could buy.
My eventual plan is to gut this, design a new frame and laser-cut or extrude it in a friendly space.
But before I get carried away looking at activity sensors and orientation sensors, let’s keep things simple.
Software
Right, so to the meat and potatoes.
Sliding a show
Basically, there are two choices here: do I try for a fancy display that does something like dropping images onto a montage (as the AppleTV does, for example), or just show each image in turn?
One of my goals was to pick something that does the former, but actually the wife says it’s too distracting, so instead I’ll just use feh to get the images on the screen in order.
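Something like this is what I have in mind — the flags are from feh’s man page, and the mount point is a placeholder for wherever the NAS share ends up (echoed here as a dry run, since feh wants an X display):

```shell
# Sketch of the slideshow command; /mnt/photoframe is a stand-in path.
FRAME_DIR=/mnt/photoframe
CMD="feh --fullscreen --auto-rotate --randomize --slideshow-delay 30 $FRAME_DIR"
echo "$CMD"   # swap echo for eval on the frame itself
```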
It can pick random files, display their EXIF data, fetch the next one across the network, wait for a period of time, and orient and scale the images.
If I later want smarter selection, or something visually more interesting, I can start replacing it.
Selecting the images
Turns out Python database interfaces have gotten a bit better. I’m using peewee - it can generate models based on an existing database, and then you can do something like this:
db = SqliteDatabase(f"file:{db_path}?mode=ro", uri=True)  # uri=True so sqlite honours mode=ro

FIELDS = [
    AlbumRoots.identifier,
    Albums.relative_path,
    Images.name,
    Images.id,
    ImageInformation.rating,
    ImageInformation.format,
    ImageInformation.orientation,
    ImageInformation.creation_date,
    ImagePositions.latitude_number,
    ImagePositions.longitude_number,
]

food_tag = Tags.select(Tags.id).where(Tags.name == "Food").get().id
food_images = (
    ImageTags.select(ImageTags.imageid)
    .where(ImageTags.tagid == food_tag)
    .alias("food_images")
)

query = (
    Images.select(*FIELDS)
    .join(Albums, on=(Images.album == Albums.id))
    .join(AlbumRoots, on=(Albums.album_root == AlbumRoots.id))
    .join(ImageInformation, on=(Images.id == ImageInformation.imageid))
    .join(ImagePositions, JOIN.LEFT_OUTER, on=(Images.id == ImagePositions.imageid))
    .where(
        (ImageInformation.rating >= args.rating)
        & (
            ImageInformation.format.in_(["JPG", "RAW-NEF"])
            & ~Images.id.in_(food_images)
        )
    )
    .order_by(ImageInformation.rating.desc())
)
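If you’d rather see what that boils down to, here’s roughly the SQL that query expresses, run against a toy three-table cut of the digiKam schema (table and column names simplified, data invented — the real schema has many more columns):

```python
import sqlite3

# Toy stand-in for the digiKam schema -- just enough columns to run the query.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE Images (id INTEGER PRIMARY KEY, name TEXT, album INTEGER);
CREATE TABLE ImageInformation (imageid INTEGER, rating INTEGER, format TEXT);
CREATE TABLE Tags (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE ImageTags (imageid INTEGER, tagid INTEGER);
""")
db.executemany("INSERT INTO Images VALUES (?,?,?)",
               [(1, "holiday.NEF", 1), (2, "dinner.JPG", 1), (3, "cat.JPG", 1)])
db.executemany("INSERT INTO ImageInformation VALUES (?,?,?)",
               [(1, 5, "RAW-NEF"), (2, 5, "JPG"), (3, 2, "JPG")])
db.execute("INSERT INTO Tags VALUES (1, 'Food')")
db.execute("INSERT INTO ImageTags VALUES (2, 1)")  # dinner.JPG is tagged Food

# Same shape as the peewee query: rating filter, format filter,
# and a NOT IN subquery to drop the Food-tagged images.
rows = db.execute("""
    SELECT i.name
    FROM Images AS i
    JOIN ImageInformation AS info ON i.id = info.imageid
    WHERE info.rating >= 4
      AND info.format IN ('JPG', 'RAW-NEF')
      AND i.id NOT IN (SELECT imageid FROM ImageTags WHERE tagid =
                       (SELECT id FROM Tags WHERE name = 'Food'))
    ORDER BY info.rating DESC
""").fetchall()
print([name for (name,) in rows])  # only holiday.NEF survives the filters
```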
That’s pretty readable. The next bit of code is more boring, and copies the selected images to an area on my NAS that the laptop has (read-only) access to.
Principle of Least Privilege is a strong thing here (at least, I hope it is).
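On the NAS side that means exporting the share read-only. With a stock Linux NFS server the /etc/exports line would look something like this (the path and hostname are placeholders for my setup):

```
/volume1/photoframe  frame-laptop(ro,all_squash)
```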
Operating System
I’m using BunsenLabs, a lightweight Linux distro. I’d prefer something much smaller, along the lines of TinyCore Linux or buildroot, but first I’d need to pin down the actual dependencies. Also, the wireless card support is a bit of a shitshow.
So, done wrong?
The laptop is slow. I have gigabit wireless and wired networking in the house, but that only helps if the laptop can keep up with it. Some combination of the driver and the hardware only supports 802.11b/g, so the images still arrive slowly. The NEFs are 13MB each, which doesn’t help.
Didn’t I have this problem before with the Pi? 😖
So as a quick fix I added an auto-rsync task to the startup of the laptop. The next part of this will be about pre-processing the images to make them smaller and prettier.
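The quick fix looks roughly like this as a systemd user unit — the unit name, paths and NAS hostname are my guesses at a setup, not gospel:

```ini
# ~/.config/systemd/user/frame-sync.service
[Unit]
Description=Sync photo-frame images from the NAS
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/bin/rsync -rt --delete nas:/volume1/photoframe/ %h/pictures/frame/

[Install]
WantedBy=default.target
```

Enable it with `systemctl --user enable frame-sync.service` and it runs once per boot, pulling new images and pruning anything deleted server-side.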