• Week 16.26


    We attended my aunt’s funeral on Tuesday. My complaints about the Mandai Crematorium mostly still stand, but they’ve at least moved the ugly signs printed on office paper away from the viewing windows so you can see the casket on its way to the… furnace?

    As I said last week, she was 93 and the family was mostly prepared for this. But there were tears, and some meaningful words were said, and despite my irritation with the undignified air of the Crematorium’s processes, I was struck, at a mostly subconscious level, by a sense of loss. I know, because a couple of days later I was thinking about orchids.

    Since I was a child, I’ve known orchids to be a part of my family’s story. My paternal grandparents were enthusiastic orchid breeders as well as co-founders of the Mandai Orchid Garden, where they helped raise the profile of Singapore’s orchids at home and abroad. I was surprised to learn while writing this that orchids are still an instrument of Singaporean diplomacy. Although I never had any interest in them myself, my late grandmother is defined in my memory by her fondness for them, and several other relatives (including the aunt who just passed) had hybrids named after them, created by my grandfather.

    As mentioned last week, I have been experimenting with generative art, and it entered my mind that I could try to simulate orchids — creating infinitely unique flowers and plants in code. Now, this is nothing new. Humans have been trying to reproduce natural processes like botany with algorithms for almost as long as we’ve had computers. But the more I thought about bringing millions of digital orchids to life, the more I thought about where they would go after. To create a beginning is to guarantee an end. The result is a digital artwork I’ve called Orchids, Once. It’s a sort of meditation on impermanence.

    You can summon a new orchid into existence, but know that you’ll be the only one who ever sees it. When you leave or reload the page, it’ll be gone. Does the fact that there are potentially billions more make it less special? Or that it cost nothing? Or that it’s not technically “alive”? In any case, I hope people will cherish the brief amount of time they spend with each flower. I didn’t design a “retry” or “new orchid” button because the responsibility of ending a session should rest with the viewer.

    Orchids, Once. also stems from the generative music experience I gained while making DataDeck, and features an ambient soundtrack that’s created in real time as the orchids turn and sway in the digital wind, as unique and unrepeatable as the flowers themselves.

    I had to work with both Gemini and Claude to get this thing in shape. I didn’t save enough screenshots of the development process, but here are two from the prototyping phase that AI would have you believe were good enough to ship, and that look like orchids.

    Many hours of refinement later, I had models that could pass for plants, but they had a nasty habit of growing backwards into themselves, or occasionally mutating into unholy jagged messes. I thought they were finally getting somewhere, but then we took a trip to a plant nursery nearby for a little field research. I spent some time looking at dozens of real orchids and taking pictures, and came home with lots of changes to make. I have learnt more about orchid anatomy this week than in decades of being part of an orchid-breeding family.

    I also can’t help but reflect on the past few weeks of making things in code with AI — this only started on March 1, but it feels like months ago. Orchids, Once. is my 10th “app” (but the 9th released).

    The first few toyed with pulling data from online sources: Collagen pulled album art from iTunes, Urban Jungles pulled weather data from Open-Meteo, SkySpotter pulled air traffic data from OpenSky.

    Then the next few pulled data from online sources and tried to make something new out of them: Library Supercollider mashed up texts from Project Gutenberg, CommonVerse let you play with words from a dictionary, DataDeck generated music from public Singapore data feeds, and Crumbs let you build your own “maps” with location data.

    The most recent ones? They’ve been about generating their own assets out of nothing, without drawing on external data: the GenArt wallpaper/image maker I’m still working on, daily 3D mazes to escape from, and these orchids. These shifts weren’t conscious or planned, but it’s curious to look back and notice it.

    I’ll stop at 10 for a while, and maybe pick things up again after I get back from my holiday.


    One bit of housekeeping: I found the time to revisit my first app, Collagen, and make some improvements I’ve been wanting to see for a while. You can now use images in different aspect ratios, not just squares. And each image can be zoomed and cropped really easily with a new editing overlay. You no longer lose images if you change the grid size, text cells can be edited, and the UI has been given a mild glow-up. I feel like I’ve learnt a lot since then, and this v2.0 brings things up to date.


    Media activity

    My book club finally finished reading Michael Crichton’s Sphere and I gave it three stars on Goodreads. In the end, my vague recollections from reading it as a teenager mostly held, although a slightly racist and sexist worldview permeates the text, and I’m sensitive to how much of that would not fly today. I’m eager to see how the film adaptation, made a decade later, handles it when we watch it together next week.

    The second season of The Pitt ended after 15 episodes and damn I’m going to miss it. This is a show that alerts me to how ignorant I am of certain (most?) social dynamics and other signs people tend to give off.

    I’m speaking of the series in general here, so I hope this doesn’t spoil anything for anyone: suicidal ideation is a recurring theme that I didn’t take very seriously — which is the whole point of the show’s handling of it.

    I go on Threads after every week’s episode to read people’s takes and interpretations, and I’m always learning something. This week some people got mad that men don’t take this suicide stuff seriously, or can’t see it at all and can’t talk to their friends, and I guess I’m a little guilty of that. I didn’t know the character on the show was thaaaat serious, and thought “eh, they’ll walk it off. It’s no big deal, everyone imagines it sometimes.” Apparently not.

    Unintentional death theme continuing: I watched a Japanese film on MUBI: Super Happy Forever (2024). It’s about a widower who goes back to the seaside town where he and his wife met on holiday. It jumps back and forth in time and does a few other things that should yield more emotional impact than it does. I wrote on Letterboxd: I think the ingredients of a proper 4-star movie, the kind you rewatch every five years, are here but not properly assembled. Nairu Yamamoto is so lovely, so magnetic in all of her scenes that she redeems her supremely annoying partner like the best of people do. Shame.


  • Orchids, Once.


    View the digital artwork at https://orchidsonce.xyz


    Almost every orchid you’ve ever seen was intentionally bred — a slow accumulation of crossings, selections, and genetic accidents that produced something new. This is the same process, compressed into a digital instant. Every visit generates a unique specimen: structure, colors, and proportions assembled from code the way a real orchid is assembled from DNA. No two will ever be alike.

    As it turns in the light, you’ll hear music shaped by the flower’s appearance — the soundtrack itself is a one-time miracle, as unique as the visuals on your screen. Its presence completes the meditation.

    When you close the window, the orchid dies. There is no save state, no gallery, no record of what you saw. Each plant lives only as long as you stay. If you weren’t there, it wouldn’t exist at all.

    There is always another one waiting to grow — but not that one. Never again that one.


    Disclaimer: I made Orchids, Once. with the help of Gemini and Claude LLMs, and take no responsibility for any allergies or other harms.

    Related blog post: Week 16.26


  • a maze, a maze, a maze…



    Play a maze, a maze, a maze… at amaze3.app


    Every day, a new maze appears. Everyone in the world gets the same one.

    There’s something cozy and comforting about knowing that right now, somewhere, another person is navigating the same corridors, hitting the same dead ends, and having the same moment of doubt about whether they just walked in a complete circle. Some days the maze is generous and you are out in twenty seconds. Other days it will make you work for it, and you will feel the exit before you see it.

    Each maze has a target time based on the shortest possible path. Finish close to it and you’ll earn an S-rank celebration and a shareable stats message. Go slower and you’ll land somewhere between a laudable A and a sad D — either way, there is always the group chat to prove you showed up and tried.

    Three modes: Standard comes with breadcrumbs showing where you have been; Hard Mode removes them and trusts you to hold the map entirely in your head; Chill Mode turns the timer off for people who just want to wander. Themes range from an outdoor garden maze to a retro game dungeon, so you can get lost in a way that feels right for you.

    A new one tomorrow. And the day after. A maze, a maze, a maze.


    Disclaimer: I made a maze, a maze, a maze… with the help of Google’s Gemini 3 Pro LLM. No responsibility taken for wrong turns or damaged self-esteem.

    Related blog post: Week 15.26


  • Week 15.26


    I’m looking through my camera roll to remember what happened this week and it’s mostly a bunch of “artworks” I’ve been making. Wait, let me step back: I’ve had an interest in procedurally generated graphics (GenArt) for a while, and it peaked with the NFT boom of 2021–22, when I spent a relatively obscene amount of money minting and collecting artworks I really liked (not the monkeys). I’m mostly drawn to the idea of mathematically rigid routines producing organic beauty — the contrasts in that, and the unpredictability of what you get when you roll the RNG dice.

    So after my recent experiments in making apps, I wondered if I could get AI to write me code that would generate images based on concepts I described. The answer is, of course, yes! It’s important to note this isn’t prompting for images (like when you use Midjourney or DALL-E), it’s prompting for the math behind making images. And once you’ve created the rules by which it draws different art styles, you can create a nearly infinite number of unique artworks by dialing different variables up and down.
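To make the idea concrete, here’s roughly what “prompting for the math” produces: a style ends up as a deterministic function of a seed plus a few dials. Everything below — the parameter names, the shape format — is a made-up sketch for illustration, not code from my actual app.

```javascript
// A style as a pure function of (seed, dials): same inputs always
// reproduce the same artwork, and nudging a dial changes its character.

// Small deterministic PRNG (mulberry32) so a seed fully defines a piece.
function mulberry32(seed) {
  let a = seed >>> 0;
  return function () {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = a;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Hypothetical "dials" a UI could expose: shape density, palette size.
function generatePiece(seed, { density = 0.5, colors = 4 } = {}) {
  const rand = mulberry32(seed);
  const count = Math.floor(20 + density * 180); // density scales shape count
  const shapes = [];
  for (let i = 0; i < count; i++) {
    shapes.push({
      x: rand(),                          // normalized canvas position
      y: rand(),
      radius: 0.01 + rand() * 0.05,       // normalized size
      color: Math.floor(rand() * colors), // index into a palette
    });
  }
  return shapes; // a renderer would then draw these on canvas/WebGL
}
```

The point of the seed is reproducibility: the same seed and dials always yield the identical piece, which is what makes “rolling the RNG dice” feel like minting rather than smearing pixels.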

    One example is a “style” I made called Labyrinth, which produces actual, solvable mazes. Depending on the variables you adjust, you can make mazes ranging from tiny to massive, with just one solution, or many. If you asked an image generation AI to draw a maze, it would likely lack the coherence of a real maze, because of the way it operates — focusing on the superficial appearance and not the integrity of its paths. But an AI model can make the math to draw a maze.
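For the curious, the coherence comes from the algorithm, not the pixels. A classic way to guarantee a solvable maze is a depth-first “recursive backtracker”, which carves a spanning tree over the grid so every cell is reachable from every other. A sketch of the kind of code an AI can write for this (Labyrinth’s actual implementation may differ):

```javascript
// Carve a maze by depth-first search: start with all walls up, wander to
// unvisited neighbors knocking walls down, and backtrack at dead ends.
// Because the open passages form a spanning tree, the maze is always
// solvable, with exactly one path between any two cells.
function generateMaze(width, height, rand = Math.random) {
  const walls = Array.from({ length: height }, () =>
    Array.from({ length: width }, () => ({ n: true, s: true, e: true, w: true }))
  );
  const visited = Array.from({ length: height }, () => Array(width).fill(false));
  const stack = [[0, 0]];
  visited[0][0] = true;
  // [dx, dy, wall on this cell, matching wall on the neighbor]
  const dirs = [
    [0, -1, "n", "s"], [0, 1, "s", "n"], [1, 0, "e", "w"], [-1, 0, "w", "e"],
  ];
  while (stack.length) {
    const [x, y] = stack[stack.length - 1];
    const options = dirs.filter(([dx, dy]) => {
      const nx = x + dx, ny = y + dy;
      return nx >= 0 && ny >= 0 && nx < width && ny < height && !visited[ny][nx];
    });
    if (!options.length) { stack.pop(); continue; } // dead end: backtrack
    const [dx, dy, here, there] = options[Math.floor(rand() * options.length)];
    const nx = x + dx, ny = y + dy;
    walls[y][x][here] = false;   // knock down the wall between the two cells
    walls[ny][nx][there] = false;
    visited[ny][nx] = true;
    stack.push([nx, ny]);
  }
  return walls;
}
```

Dialing “many solutions” is then just a matter of knocking down a few extra walls after the carve, which turns the tree into a graph with loops.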

    I start most of these by thinking up an artistic production approach, say “take sheets of colored cardboard or acrylic, and punch holes of varying shapes into them, then layer them on top of each other so the holes line up (or not), and randomly spray contrast-colored paint on some of them”. Then I describe the possible variations and variables I want to control to the AI, such as the density of shapes, the thickness of the borders, the ratio between angular and organic lines, and we iterate after seeing some of the results. Just think of all the methods and ideas you might want to play with, and how this lets any old idiot model them on their computers!

    The meta project is that I’ve made a modular app that handles all these different styles for me, whether they require a 2D canvas or WebGL. The app provides a common UI layer that all “styles” can plug into, which allows me to control them. Now that it’s done, I can just focus on experimenting and having fun making new artworks. I daresay a few of these are executed as well as any of those I spent money on.

    I’ll probably release it as a wallpaper generator once I have enough styles built in, if anyone’s interested. But mostly I love having this as a background project that I can dip into, on and off. It allows me to take on other app ideas as momentary “side quests”.

    While making Labyrinth, I showed a maze to Cong, who said “You should do a puzzle maker”. To which I said, “Nah.” And then a minute later… “Although, a daily maze game. Hmm.” It made sense that I could save time by taking CommonVerse’s daily random generation mechanic and combining it with Labyrinth’s logic to make a daily maze challenge. But would it even be fun to trace a 2D maze with your finger and try to solve it? No… so what if it was a 3D maze you had to escape?

    The first prototype took a couple of hours, and I’ve been polishing it for the last few days. I think it’s coming along nicely. I’ll put it out soon, once I balance the difficulty and get more feedback from testing.

    The development of a maze, a maze, a maze… was hampered by a rare bar crawl with Howard and Jussi on Thursday night that gave me a massive hangover lasting into Friday afternoon. When I got home, I was too plastered to care that my vinyl copy of J Dilla’s Donuts had arrived from Amazon US protected by nothing more than a flimsy paper envelope. By the clear light of day I was amazed that they would even do such a thing. The discs are intact, but the sleeve has a bent corner. If I’d ordered from Amazon Japan, I would bet a major internal organ that it would come wrapped in four layers of stiff cardboard, bubble wrap, and a handwritten apology for their carelessness.

    Did I mention we’re going to Japan again? It’ll be a short vacation, in a couple of weeks’ time. Not much on the agenda, just checking in on the state of curry rice and egg sandwiches. Maybe see some nice art. Take some photos.

    Which brings me to the latest betas of Halide MkIII, which I’m very much looking forward to using on the trip. They’ve been progressing the app nicely, and it might be enabling the Holy Grail of iPhone photography workflows for me. Ironically it involves using Halide not as a camera app, but just as a photo editor. You can shoot compact (lossy, JPEG-XL compressed) ProRAW photos up to 48mp with the default camera app, then edit them in Halide to have the same look as their Process Zero photos! What this means: you get all the benefits of computational photography at time of capture, including noise reduction and night mode, but you’re also free to dial it back and get natural, “real camera” photos in post if the scene calls for it.

    As much as I like these side quests, I think making my own photo editor would be biting off entirely too much to chew, so I’m still rooting for these guys to crack it.

    While writing this post, I got the news that an elderly aunt passed away at the age of 93. She had been in reduced health since the Covid years, but by all accounts she went very peacefully and I guess you can’t ask for much more than that after a long life. The extended family’s Chinese New Year routines fell apart in recent years after she pulled back from organizing them, so it was fitting that some of us got to reconnect at her wake on Sunday evening.

    See you next week.


  • Week 14.26


    An update on my app addiction

    On Wednesday morning I woke up and saw that my last app DataDeck was getting a bunch of likes and reposts on Bluesky, which was a nice surprise. If ever there was a place where people would appreciate a wacky, nerdy idea, I guess that would be it.

    My Instagram Story on Wednesday

    I made a couple of post-release updates to my magnetic poetry non-game, CommonVerse. There are now two new themes, one called Label Maker that resembles those little Dymo stickers we used to make, and another called Zine which is like a random note of cutout words. The UX has also been improved in subtle ways that should make it easier to compose sentences.

    My “main” app project now is one that I can keep noodling on in the background, with no real endpoint — it’s done when I think it’s done — the idea being that it would help me slow down and spend less time with this vibe coding stuff. Guess what happened? That’s right, if you design something that can sit on the back burner, it will sit on the back burner. I started work on another app instead.

    Defying time and gravity

    I’ve known that the next step was to play with agentic coding tools like Codex or Google’s Antigravity. These are code editors with integrated AI that can look across all your project files and manage multiple agents working on simultaneous tasks. It’s a far cry from the way I’d been working: getting advice and instructions from a single chat, and then doing everything myself in a code editor. So I finally got started with Antigravity, and it blew my mind.

    The productivity increase is hard to overstate. I could simply describe what I wanted and it would get done without further work on my part. The tool can use the system’s terminal and Chrome browser to install packages, click around and test the app, figure out why things aren’t working, and fix it while you watch. Stuff that took me days over the last month could have been done in hours. It was automating so much of what little I, the non-programming human, was doing and considered my job, that it made me feel kinda redundant, to say nothing of real programmers.

    With Antigravity, the MVP of my app concept was done in three hours on a Friday. The good/bad news was that it blew through most of my token allocation for the week. So I went back to the “old” way of working and made subsequent changes manually. What I discovered was that I much prefer getting hands on with the project files, looking through the code to understand what was going on and what went where. I think I’ll use these agentic tools to get started fast and figure out a working architecture. After that, it’s more fun to get involved and make improvements slowly.

    Ate and left the Crumbs

    So the new app is called Crumbs, as in breadcrumbs, as in leaving a trail of them so you know where you’ve been. It’s a private location journal that lets you mark where you are on a map with a single button push. Over time, you can see the path of your journey(s).

    I made this because I’ve always wanted something like this for logging holidays, and no app really does what I want. Foursquare’s Swarm is based on Places, so you have to find the business listing or entry in order to check in. If you’re in the middle of a national park, or in a country where no one has created Places, or you can’t read the names, you’re out of luck. Google Maps has a Timeline, but it tracks your location all the time, and it only shows your trail on a day-by-day basis. Your data is also locked in their app and you can’t get it out to visualize in other ways.

    Crumbs is private, and you can take the data out in JSON format. It logs the time and weather along with your location, and you can write little notes. You can save an image of your map, or export a PDF of your journal.

    A big breakthrough (for me)

    Unfortunately, because it’s a web app and not a native iOS app, it can’t permanently store data on your device. The OS may decide to purge all your data if you haven’t used it in a week. That’s a dealbreaker for any app intended to be a life-logging tool. That really bummed me out, and I thought it would just have to be a personal tool that I couldn’t distribute to anyone else — since remembering to do manual backups/restores of the JSON file would be a massive PITA for any user.

    And then I had a Eureka moment! I thought of a possible solution and asked Gemini if it was feasible, to which it answered “Yes, this is an ideal solution”. I wanted to scream “Well, then why didn’t you suggest it all this time we’ve been discussing how to get around the problem!?”

    The answer was Dropbox integration. I can’t make a web app read/write files locally, but I can do it in the cloud. So now Crumbs is as useful as a “real app”, provided you connect a Dropbox account.
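If you’re wondering what that looks like in practice, here’s a rough sketch: the journal lives as a single JSON file, and backing it up is one call to Dropbox’s HTTP upload endpoint. The pin fields and the file path below are made up for illustration; the endpoint and headers are from Dropbox’s public API, but this is not Crumbs’ actual code.

```javascript
// The journal is plain JSON, so a purge of local storage is recoverable
// as long as a copy of this file lives in the user's Dropbox.

function serializePins(pins) {
  return JSON.stringify({ version: 1, pins }, null, 2);
}

function deserializePins(json) {
  return JSON.parse(json).pins;
}

// Mirror the journal to Dropbox, overwriting the previous backup.
// (Illustrative: the path "/crumbs.json" is a hypothetical choice.)
async function backupToDropbox(accessToken, pins) {
  const res = await fetch("https://content.dropboxapi.com/2/files/upload", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Dropbox-API-Arg": JSON.stringify({ path: "/crumbs.json", mode: "overwrite" }),
      "Content-Type": "application/octet-stream",
    },
    body: serializePins(pins),
  });
  if (!res.ok) throw new Error(`Dropbox upload failed: ${res.status}`);
}
```

On launch, the app can do the reverse: download the file, `deserializePins` it, and repopulate local storage, which is what makes the OS purge a nuisance instead of a dealbreaker.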

    As of Monday morning this post is late and I think Crumbs is ready, so here it is.


    Other thoughts

    • Here’s a free idea: I was inspired by this stamp journal that went semi-viral, and wanted to make some sort of digital Instax photo album. It’d be kinda nice to keep a virtual scrapbook of interesting images, right? Well, turns out you can just use Apple’s Freeform app and Dazz Cam. It’s as simple as making a board and dropping in images, then arranging them however you want. All stored locally and synced to iCloud, easy peasy. Just because you can vibe code it doesn’t mean you should.
    • My iPhone’s MOFT Snap Case developed a cut/tear in its faux leather surface, and so had to be replaced after just six months. Its replacement is a Caudabe Sheath, which fits my requirements of being neither silicone nor slippery, with full edge coverage and a Camera Control passthrough button. It’s a hard plastic material with a rough, pebbled texture that makes it feel secure when held. It also came in second in MobileReviewsEh’s roundup of the year’s best cases. I got the version with the ‘open’ cutout for the 17 Pro Max’s camera island, not the ‘precise’ covered design.
    • Kim managed to finish reading Project Hail Mary and we went to see the film on Sunday (non-IMAX). Apparently there’s a longer cut, nearly four hours, which will be released on streaming in August when it comes to Amazon Prime Video. Yes, this is billed as an Amazon original film from the very first frame, coming even before the MGM logo (which they own), and I don’t think that will ever stop being weird. The film is good, a mostly faithful adaptation of a fun but slightly flawed book. I just think they glossed over a lot of detail in the final act, which lowered the stakes and made it less exciting and rewarding than it could have been. Hopefully the extended cut’s extra run time is concentrated at the end.

  • Crumbs


    A location journal that’s actually yours.

    Try it at CrumbsMap.vercel.app


    Most map apps are for navigation, not remembering. You know those sequences in old films like Indiana Jones where a dotted line traces across a map from city to city? That is what Crumbs does, except it is your life and the dots are places you actually went.

    Effortless location logging

    The idea is simple: press a button to log where you are, write a note if you feel like it, and watch your trail build across the map. No passive background tracking, no accounts, no selling your movements for ad targeting. Just the places you chose to remember, connected by a line, yours to keep.

    Most travel apps get this wrong in one direction or another. Google Maps’ Timeline tracks you constantly whether you want to remember or not. Swarm needs a business listing to exist before you can check in. Neither lets you draw a line across a whole week, or a custom trip length, and export it cleanly. Crumbs does all of that, and stores everything locally on your device.

    Do things with your data

    There is a list view that lays out your stops like a journal with timestamps, weather, and location metadata, exportable as a PDF keepsake. You can also save a clean image of your map at any point, ready for sharing or scrapbooking.

    Crumbs is a PWA (Progressive Web App), which means mobile operating systems may occasionally purge its local data if it isn’t used in a while. However, connect your Dropbox account and we’ll sync with the cloud automatically, so you won’t lose a crumb. If Dropbox isn’t your thing, manual JSON export and import are available for backups. Either way, your data is yours to keep and use freely. Vibe code an app to generate custom posters, for example.

    If you want native background tracking that runs without you thinking about it, I recommend Where Now? — a free indie app by Scott Boms that also logs your location privately. Crumbs can import Where Now’s data exports so you get the map trails and other features. Best of both worlds.

    Other details:

    • Pins capture location, date/time, city/country, and current weather conditions from Open-Meteo.
    • Works offline: Pin your location while off the grid, and Crumbs will show it on the map when you’re back online.
    • Filter map and list views by Today, This Week, This Month, All Time, or a custom date range to see only specific trips.
    • Uses standard Plus Codes as a shorthand for geolocation, so PDF exports retain all relevant information in a human-friendly form.
    • Open any pin location in Google Maps for more detail on the places of interest near your moment.
    • Minimal, glassy UI that “puts the focus on your content™”.
    • Bread mode: replaces all red pushpins with baked goods.
    • Red string trails can be disabled in settings.
    • Pins can be moved via drag and drop if necessary.
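For a sense of how simple the underlying data is, the date filtering above amounts to comparing timestamps on plain pin records. Field names here are hypothetical, not Crumbs’ actual schema:

```javascript
// Filter pins to a date range (inclusive). Each pin carries an ISO
// timestamp, so "This Week" or a custom trip is just two comparisons.
function filterPins(pins, fromISO, toISO) {
  const from = Date.parse(fromISO);
  const to = Date.parse(toISO);
  return pins.filter((p) => {
    const t = Date.parse(p.time);
    return t >= from && t <= to;
  });
}
```

Because the records are this plain, the same array feeds the map view, the list view, and the JSON export without any translation layer.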

    Disclaimer: I made Crumbs with the help of Google’s Antigravity and Gemini 3.1 Pro. Your location data stays on your device. I have no idea where you’ve been.

    Related blog post: Week 14.26


  • Week 13.26


    I finished my sixth app: DataDeck. It simulates a fictional hardware music player called the DataDeck SG-01, or more accurately, a music generator. It reads live, open data feeds from the Singapore government’s data.gov.sg portal and translates them into unique musical compositions.

    My first prototype ingested the tourism stats for International Visitor Arrivals to Singapore since 2008, and when I first experienced the silence of the Covid years, with the beat gradually building back up again after 2022, I knew I was on to something. Data sonification is a cool term for nerds, but hearing the stories stored in the numbers is something anyone can understand and appreciate.
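A toy version of the idea, to show how little is needed before the data starts to “speak”: map each value in a series onto a pentatonic scale, and let zeroes become rests. The mapping below is my illustration, not DataDeck’s actual logic.

```javascript
// Data sonification in miniature: normalize a series (say, monthly
// visitor arrivals) against its peak and map each value to a note on a
// C major pentatonic scale. Zero values become rests, so a collapse in
// the data is heard as silence.
const SCALE = [60, 62, 64, 67, 69, 72, 74, 76]; // MIDI note numbers

function sonify(series) {
  const max = Math.max(...series);
  return series.map((v) => {
    if (v <= 0) return null; // a rest: the silence of the Covid years
    const idx = Math.min(SCALE.length - 1, Math.floor((v / max) * SCALE.length));
    return SCALE[idx];
  });
}
```

Feed the resulting note list to a Web Audio synth at a fixed tempo and you already have a crude “cassette”: `sonify([0, 50, 100])` yields a rest, a mid note, and the top note, which is the arc of a recovery in three beats.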

    At about ten days of development time, it’s the biggest project I’ve delivered so far with the help of AI — there’s no saying how long it would have taken me to do on my own. A million years? Instead, in just 10 days: parsers for 10 different datasets, 10 varied musical styles, and 10 switchable themes.

    The inspiration for its interface was the kind of hardware devices my dad had in the 70s and 80s: calculators, microcomputers, and tape decks from companies like Braun, Sharp, Sony, and Texas Instruments. A sort of Rams-ian, Bauhaus-ish modernist school of industrial design. The different color schemes you can choose from evoke specific brands or devices, like Apple’s Snow White-era or the original Nintendo Game Boy (DMG-01) and the Roland TR-808. I especially enjoyed working within the constraints of an imagined hardware UI, so when you switch to a dataset mapped to Singapore’s physical geography, the drum pad buttons get remapped to move a reticle around the map. It makes it feel more real, imo.

    The idea of playing with procedurally generated music using software-synthesized Web Audio was probably seeded years ago when I collected the 0xmusic series of art NFTs, which generated endless musical sequences from code on the Ethereum blockchain. I dare say that DataDeck is more advanced, and with better sounding musical output than those. Plus I’m making it free, and you don’t have to risk social judgement by going anywhere near crypto.

    I’m especially proud of the app’s design and musical qualities. There are a hundred little details in this thing I could mention that were cool to implement, but users don’t have to know or care about. Although it’s an app made for myself by myself, I’m still inordinately satisfied with and impressed by it. I’ve helped deliver a few apps in my career (some of them even won awards), but DataDeck already feels like one of my favorites.

    I think that’s because designing in the real world is all about the navigation of compromises — technical debt, financial limitations, organizational will, and a lack of time all get in the way of polishing features you know could be great, or fixing annoying bugs that other stakeholders don’t seem to mind. Personal projects are not like that, and acceleration with AI makes them even less so. I made this thing how I wanted, and was able to tweak the mix or rebuild a cassette’s music logic from the ground up twice a day if I wasn’t happy with it.

    I’ve also been thinking about how narrow the term “vibe coding” is. On one hand, one-shotting an app by asking Claude to “build me a kitchen timer” is vibe coding. But using AI to create a complex tool where humans design the screens, sweat the UX, and look after the details is also kinda vibe coding. I talked recently about how the distinction between designing and developing will fade, and making stuff is all that will matter, and so it stands to reason that eventually coding with AI will just be called coding.

    I spent Friday afternoon with Jussi meeting up with two separate friends, both also middle-aged men, who are similarly interested in this evolution of design/development work, and who are working on their own projects with Claude Code, OpenAI Codex, and other tools. We’re all at different levels of familiarity and sophistication, but it was good to meet for a little co-working + Show & Tell time at cafes on a weekday. I think there’s value in forming a little “late boomers’ coding club” for fellow initiates.

    In any case, I’m hella tired, guys. I started on my next app idea but immediately got hit by fatigue on Saturday afternoon and needed a nap. Switching gears from audio generation to working on more visually-oriented functions was too much context switching to do over the weekend. Think I’ll finish reading a couple of books first before getting back to it.

    I know it’s been app-this and app-that around here for the last month and so maybe some readers (or a future me who’s been thrown in ethics jail for AI use) will appreciate hearing about other things. Let’s zoom all the way out then, into outer space.

    The film adaptation of Project Hail Mary is getting such great reviews and most people in my book club have already seen it. Unfortunately, I have to wait because Kim has finally started reading it, about three years after I told her to. Hopefully she’ll finish before the local IMAX run ends, but nothing in this life is guaranteed.

    There’s just something about stories of people in space, either lost or stranded, alone or in a small team, solving problems with limited resources, all the while confronted by the massive universe-facing perspective of being so small and meaningless. Andy Weir’s The Martian really resonated with people, and Project Hail Mary is having its moment too. I also enjoyed Daniel Suarez’s two Delta-V books a few years back. But the ultimate one that has yet to be beaten for me is Neal Stephenson’s Seveneves.

    The book I’m reading now might be a serious contender though. I’ve had Samantha Harvey’s Orbital on my list for the better part of a year, knowing very little about it, except that it’s about astronauts. Now that I’ve started, I don’t want it to end, I want more of everything, more words from this magnificent brain. You’ll know by the end of the first three pages whether this is a book for you. It’s intensely beautiful, unusual writing. It borders on poetry — perhaps too melodramatic for some — actually it steals over the border by moonlight and maps the territory. I don’t know how Harvey knows what it feels like to be in space, and what astronauts think about as they look down on Earth, but she absolutely does. You can’t write like this unless you’ve stowed away on an ISS mission and been through it. It’s a monumental work, and the best book I’ll probably read all year.

    Literally on the other end of that spectrum, the book club has decided to read Michael Crichton’s Sphere, which is set at the bottom of the ocean and probably isn’t very beautiful or philosophical. I read it once, maybe thirty years ago, and thought I only remembered the contours of its plot, plus flashes of the 1998 film adaptation starring Dustin Hoffman. As I read its opening pages, I was shocked at how familiar some of the writing and scenes were. It must have made an impression on me.

    Since the moratorium on spoilers has probably expired, I think it’s okay for me to mention what I recall: it’s about a mysterious ship that a bunch of scientists are trying to study in a deep sea lab. As time passes, they experience unnatural events, and it’s revealed that the titular sphere onboard has been “having an effect on them”. It’s a mashup of The Abyss and Solaris, essentially. I don’t want to rush Orbital, so I’m going to put that aside and work through Sphere as quickly as I can.

    Speaking of space, the deep sea, and being packed into tight metal containers, I picked up a can of my usual Ayam-brand sardines in extra virgin olive oil the other day and felt a weird “thunk” as I turned it over. I’ve handled enough of these cans now to know when something feels off. Opening it, I discovered only two fish instead of the usual three. That sensation was them loosely rolling around in the oil. It wasn’t like these were two large ones and there wasn’t room — someone on the packing line simply neglected to fill the available space and closed it up. At first I was incensed, and then I tried to let it go. We all deserve to make mistakes, and some sardines should get to enjoy a little more personal space. Be good to yourselves, and I’ll see you next week.


  • DataDeck

    DataDeck

    Introducing the DataDeck SG-01.

    Turn on, tune in, and nerd out at datadeck.app.

    Singapore generates (and publishes) an extraordinary amount of data about itself — temperatures, taxi coordinates, dengue clusters, carpark availability, ticket sales at major attractions. Numbers that civil servants read in spreadsheets and the rest of us ignore entirely. The DataDeck asks, “but what does it sound like?”

    Each Data Cassette draws live government feeds from data.gov.sg and renders them as distinct genres. There are ten cassettes in all, each with its own acoustic logic and way of interpreting the city.

    The Climate cassette pulls real-time NEA temperature and humidity readings across 12 geographic sectors and converts them into lo-fi hip-hop — with chords deepening as humidity climbs, and the scale drifting toward Lydian as the heat rises. The Transport cassette tracks unoccupied taxis plying the streets and generates a relentless 303-style midnight techno. HDB carparks become polyrhythmic Afrobeat, and the movements of the stock exchange drive a satisfying hip-hop groove. Get money y’all! Check out the sound of visitor arrivals during the COVID years: like musical crickets.
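    To make the Climate cassette’s idea concrete, here’s a minimal sketch of what a reading-to-music mapping could look like. Everything here — the function name, the 32°C threshold, the humidity bands — is my own illustrative assumption, not DataDeck’s actual code:

    ```python
    # Hypothetical sketch of a Climate-cassette-style mapping:
    # humidity deepens the chord (more stacked thirds), and heat
    # drifts the scale from Ionian toward Lydian (raised fourth).
    # All names and thresholds are invented for illustration.

    IONIAN = [0, 2, 4, 5, 7, 9, 11]   # major scale, semitones from root
    LYDIAN = [0, 2, 4, 6, 7, 9, 11]   # same scale with a sharpened 4th

    def climate_to_music(temp_c: float, humidity_pct: float) -> dict:
        """Map one temperature/humidity reading to chord and scale parameters."""
        # Hotter readings switch the scale toward Lydian.
        scale = LYDIAN if temp_c >= 32.0 else IONIAN

        # Higher humidity deepens the chord: triad -> 7th -> 9th -> 11th.
        # 0-100% humidity maps onto 3-6 stacked scale tones.
        voices = 3 + min(3, int(humidity_pct // 25))
        chord = [scale[(2 * i) % 7] + 12 * ((2 * i) // 7) for i in range(voices)]

        return {"scale": scale, "chord": chord, "voices": voices}
    ```

    A cool, moderately humid afternoon would yield a plain major-seventh chord, while a hot, sticky one stacks all the way up to an eleventh in Lydian — which is roughly the “chords deepening as humidity climbs” behaviour described above.
    
    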

    The controls? Three knobs shape density, tempo, and atmosphere. A mix fader redistributes the instrument balance. AUTO mode hands navigation back to the machine. There’s a user manual built in, should you get lost.

    It’s a music player with no music files. It’s a data dashboard you can close your eyes to. It’s Singapore, rendered in sound. Put your headphones on, and press play.

    Pro tip: If you really love DataDeck, you can save it to your phone’s Home Screen, which gets you a nice icon and a full-screen mode that shows the whole device at once without distractions.


    Disclaimer: I made this with the help of Gemini 3.1 Pro because I’m just an old designer who hasn’t coded stuff since GeoCities. I take no responsibility for any damage you cause yourself or others with this. Thank you.

    Related blog post: Week 13.26