Tag: Vibe Coding

  • Week 10.26


    Last week I got started vibe coding with Gemini 3 Pro and was happy enough with the collage-making app I made that I deployed it to Netlify and posted a separate writeup for it here on this site. I also decided to rename it to Collagen, as in Collage-Generator, thanks to a suggestion from Michael.

    For my second project, I wanted to go much further and test the LLM’s ability to code up something more complex, with real-time 3D modeling and rendering. But what to make? One shower later* and I had a concept I was excited to try out: An app called Urban Jungle that would be a weather visualizer, depicting a world where humans have disappeared and our cities have been reclaimed by nature.

    I could see it clearly in my head, and had the idea (in retrospect, a brilliant one that you should absolutely steal) that vibe coding projects should start just like real ones — with concept art. Taking the time to visualize what you want is the first test of whether it deserves to be built. It aligns the team behind a single vision, with fewer chances for miscommunication and wasted time.

    I prompted Nano Banana 2 to generate a screenshot of Urban Jungle as if it were a finished product, describing it exactly how I wanted. The result was astoundingly close to what I’d imagined. With this visual in hand, I was able to brief the coding AI that much faster. Sure enough, the first prototype it spat out nailed the isometric view angle, UI, and core functionality.

    That it could achieve a pseudo-3D effect with CSS and standard web technologies, writing the whole thing in a minute, was already blowing my mind. But like any difficult client, I thought “why not ask for more and see how far I can push my luck?”

    The next version (v1.0) was a total rewrite of the graphics engine, now in full 3D using three.js. Each city is procedurally generated to be unique, with different forest/jungle topographies depending on the region. The increased detail meant I could add decaying buildings, pylons, and roads. When you tap on the trees, flocks of birds scatter. When it gets cold, the vegetation dies, and below 0ºC the ground becomes covered in ice and the birds disappear. I thought… ‘this is great! I think we’re done!’
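I can't show the engine Gemini actually wrote, but the standard trick behind "each city is procedurally generated to be unique" is to hash the city name into a seed and feed it to a deterministic PRNG, so the same city always regenerates identically. A minimal sketch, with function names and parameters of my own invention:

```javascript
// Sketch of per-city procedural generation (my own illustration, not the
// app's actual code). Hash the city name into a seed, then drive a
// deterministic PRNG so the same city always grows the same jungle.

// FNV-1a string hash -> 32-bit seed
function hashCity(name) {
  let h = 0x811c9dc5;
  for (const ch of name.toLowerCase()) {
    h ^= ch.codePointAt(0);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h >>> 0;
}

// mulberry32: a tiny deterministic PRNG returning floats in [0, 1)
function mulberry32(seed) {
  let a = seed >>> 0;
  return function () {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Derive city-specific parameters (names are hypothetical)
function cityParams(name) {
  const rand = mulberry32(hashCity(name));
  return {
    treeDensity: 0.3 + rand() * 0.6, // fraction of tiles with vegetation
    decayLevel: rand(),              // 0 = intact buildings, 1 = ruins
    hillHeight: rand() * 30,         // max terrain elevation
  };
}
```

The payoff of seeding this way is that nothing needs to be stored: searching for the same city twice rebuilds the identical model.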

    But I should have known projects like this are never done. Next came v2.0 which rewrote the architecture to allow it to act like a proper weather app. You can now simulate the weather for any city over the next 24 hours, scrubbing through time with a slider. As you do, the lighting and climate effects change dynamically. It generates live sound effects for wind, rain, birds, and thunder. You can pan and zoom around the model with your fingers. There are now drifting clouds and proper lightning that strikes the earth during storms.
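The time-scrubbing mechanic boils down to mapping the slider's hour value onto a sun angle and light intensity. A toy version of that mapping (my own illustration; the app surely does something more elaborate with its three.js lights):

```javascript
// Sketch of driving scene lighting from a 24-hour slider (my own
// reconstruction of the technique, not the app's real code).
// Crude model: sunrise at 6, solar noon at 12, sunset at 18.

function sunElevation(hour) {
  // Elevation in degrees; negative means the sun is below the horizon.
  return 90 * Math.sin((Math.PI * (hour - 6)) / 12);
}

function lightIntensity(hour) {
  // Clamped to [0, 1]: full daylight at noon, zero at night.
  return Math.max(0, Math.sin((Math.PI * (hour - 6)) / 12));
}

// Wiring to a slider would look something like:
// slider.oninput = () => { light.intensity = lightIntensity(+slider.value); };
```

From there, the same hour value can gate the rest of the effects: clouds, rain audio, and whether the birds are out.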

Comparing the concept art with the ‘finished’ product, I’d say I got as close as one could hope with a web app contained in a single HTML file. Here’s a standalone post about the app, which brings me to another advantage of vibe coding with an LLM: these things write their own “App Store”-like product copy!

    Try Urban Jungles at urbanjungles.netlify.app

    Edit: I couldn’t leave it alone and after writing the above, made so many changes I had to implement a version history link on the front page. Now in v3.1, there’s an animated starfield in the night sky, and a freakin’ VR mode for Apple Vision Pro! It uses WebXR to place you inside a 3D environment with the city model floating in front of you. And after a couple of people suggested I add iconic landmarks like the Eiffel Tower, I decided that we could have a few as Easter eggs in major cities. Customizing every city to get full landmarks coverage of the world would be too much, even for me. But err check back next week, you never know.

It strikes me that generative AI vibe coding is modern-day Lego. It lets kids and adults alike build silly (or serious) things straight from their imaginations. It’s extremely fun and educational to express yourself in this way, if you just look at it as an advanced toy. The difference is that no one is using Lego to build a working car, or furniture, or anything that can be exchanged for money. But LLMs are already used in the building of most commercial software, and the proportion of their contributions is only going to grow.

But as a hobbyist with little coding experience, I’m afraid of how desensitizing this new ability can be, and how it will dull our ability to wait for good things. It will heighten our time preference, in other words. While it was downright exhilarating to see my idea come to life in minutes instead of weeks, or never, I know it has already rewired my brain. I expect this now. The next app I make will be judged more harshly — they all will, now that I know how “easy” this is. Patience is going to be impossible, and that’s bad for everyone.

    *But what was that asterisk up there with the shower thought? It occurred to me later that maybe I came up with Urban Jungle because of The Wall which I read last week. To reiterate, it’s a survival story that takes place after an Event seemingly decimates all of humanity save for our female protagonist. She lives in a lodge in the Austrian Alps, getting by on limited matches, ammo, and medications. She’s constantly battered by storms and weather conditions, fighting a slow, losing battle against nature. That imagery must have stuck in my head.


    After that recent aggressive reading spell, I slowed down and decided to chill with one of those cozy Japanese books that are still so popular — you know the ones, set in convenience stores, or bookstores, or cafes, or the backseats of taxis, where absolutely nothing important happens apart from a mild mental breakdown brought on by social anxiety and ennui, aka living in Japanese society. I’ve semi-enjoyed a few of these before, most notably Michiko Aoyama’s What You Are Looking for Is in the Library.

    But even those lowered expectations could not have prepared me for the absolute waste of paper/pixels that is Atsuhiro Yoshida’s Goodnight, Tokyo (translated by Haydn Trowell and published by Europa Editions). I mention all involved parties because the blame for this should be shared. Multiple people started work on this, knew what they had, and decided to keep going. I can only guess the motivating factor was profit and cashing in on this cozy Japanese book trend. I hope it was worth it.

    I am now reading Olga Tokarczuk’s Drive Your Plow Over the Bones of the Dead (translated by Antonia Lloyd-Jones), and it is soooo much more deserving of your time. Then again, she’s a Nobel laureate — perhaps not a fair fight. But that’s the thing about books versus things like Michelin restaurants: the good ones cost about the same.

    I’ll leave you with some non-AI photos I took on a walk yesterday as a palate cleanser.

  • App: Urban Jungles

Try Urban Jungles at urbanjungles.netlify.app

    We have always known, somewhere beneath the noise, that the cities we build are temporary. That the concrete will eventually crack. That something green and patient is waiting.

    Urban Jungles shows you the present — your present — transposed onto a world that has already moved on without us. The rain falling on your city right now is the same rain soaking the ruins in the app. The cold front rolling in tonight will strip the canopy bare, just as it strips the leaves from whatever remains.

    It is a diorama of grief and relief in equal measure. A miniature planet after humanity, held in your hands.

    Every city is searchable. Every hour of the next day is walkable. Step forward and watch the light change, the temperature drop, the landscape respond. Hear the sounds of wind and rain passing through empty streets. Orbit the ruins in full 3D — pan, tilt, inspect the overgrowth up close. The shadows track the actual sun, the stars come out at night. Tap the trees and birds scatter.

    Spin it slowly. Let it rain.


    Disclaimer: I made Urban Jungles with the help of Google’s Gemini 3/3.1 Pro LLM and take no responsibility whatsoever for any damage you do with it.

    P.S. If you use it on an Apple Vision Pro, a button for VR mode will appear on the toolbar. It is exactly what you think it is.

  • App: Collagen


    Use Collagen at usecollagen.netlify.app

    A simple tool for making collages, specifically with album cover art.

    Most collage tools are either bloated with unnecessary social features or too restrictive to be useful. Collagen is a single-purpose utility designed to solve a specific friction: the tedious process of manually sourcing high-resolution album art, aligning it in a grid, and then realizing you want to swap the top-left for the bottom-right. It turns a multi-step design chore into a fluid, drag-and-drop experiment.

    Features

    • Integrated Sourcing: Queries the iTunes database for official, high-resolution artwork (600×600) so you don’t have to hunt for covers or deal with low-res thumbnails.
    • Tactile Reordering: Drag and drop tiles to swap positions instantly. The layout logic handles the movement so you can focus on the visual flow.
    • Flexible Dimensions: Define your grid up to 10×10. The preview and export scale dynamically to match your rows and columns.
    • Hybrid Content:
      • Search: Instant API pulls for mainstream releases.
      • Upload: Support for local files (obscure imports, demos, or personal photos).
      • Text Tiles: Add context or labels with custom text tiles. Features automatic contrast (white/black) and a choice between a clean sans-serif or a classic serif typeface.
    • Borders: Toggle between borderless, white, or black frames. The logic includes outer edge padding for a symmetrical, finished look.
    • PWA Architecture: Built to be “Added to Home Screen.” It caches assets locally on your iPhone for faster subsequent loads and works as a standalone app.
    • Export: One-click generation of a high-resolution stitched PNG. It uses a dedicated image-proxy pipeline to ensure every tile renders correctly without the “blank square” errors common in browser-based canvas exports.
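For the curious, the “Integrated Sourcing” bullet maps onto the public iTunes Search API: it returns an `artworkUrl100` field, and the common trick for higher resolution is rewriting the size segment of that URL. A sketch of how the lookup could work (my reconstruction, not Collagen’s actual code):

```javascript
// Sketch of album-art sourcing via the public iTunes Search API
// (my own guess at the approach; Collagen's real code may differ).

// e.g. ".../100x100bb.jpg" -> ".../600x600bb.jpg"
function upscaleArtwork(url, size = 600) {
  return url.replace(/\d+x\d+bb/, `${size}x${size}bb`);
}

async function searchAlbumArt(query) {
  const params = new URLSearchParams({
    term: query,
    entity: 'album',
    limit: '10',
  });
  const res = await fetch(`https://itunes.apple.com/search?${params}`);
  const { results } = await res.json();
  return results.map((r) => ({
    title: r.collectionName,
    artist: r.artistName,
    artwork: upscaleArtwork(r.artworkUrl100),
  }));
}
```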

    Disclaimer: I made Collagen with the help of Google’s Gemini 3/3.1 Pro LLM and take no responsibility whatsoever for any damage you do with it.

  • Week 9.26


    • The featured image above is the result of having Geese’s Au Pays du Cocaine in my head all day. The line about a sailor in a big green boat and a big green coat made me think of Puffer Jacket Snoopy, and of course I had to realize the joke.
    • We got the sad news that Deliveroo is shutting down operations in Singapore. This comes on the back of an acquisition by DoorDash who must have run the numbers and decided that a 7% share of the local food delivery market after a decade wasn’t worth investing further in. We use it all the time and prefer it over Grab and Foodpanda — it is by far the better app and their subscription service is better value for money, but we’ve seen this movie before. It’s like how Uber lost out to Grab; the market doesn’t always choose efficiently.
    • I will probably switch to Foodpanda because Grab as a brand has the same icky halo as, say, Facebook or Spotify.
    • Google released Nano Banana 2, the new version of their hit image generation model. This one is cheaper to run and kind of almost as good as Nano Banana Pro, so they’re making it the default for everyone. Paid users can still access the Pro model, but it’s hidden behind some menus. It’s a regression in quality, a slight improvement in speed, and most importantly, a boost to Google’s bottom line. Since I only do silly things with these tools, it doesn’t bother me tremendously, but imagine the same happening at an enterprise level for more important work.
    Screen recording of an AI panorama
    • One of the new things Nano Banana 2 can do is generate very wide panoramic images, so I asked it to render some “panoramas taken with an iPhone” in various locations. I then upscaled those and opened them in my Apple Vision Pro. They don’t have the photorealistic quality of images from Nano Banana Pro, and the resolution leaves a lot to be desired, but they’re still immersive and impressive when viewed in this way. You can see where this might go.
    • There’s been a lot of talk lately about how AI vibe coding could upend the SaaS market, if not replacing dependable enterprise tools with individually created ones, then at least giving IT departments a billion more unapproved apps to worry about. A viral essay from last week posited that AI coding could kill DoorDash, though I’d say they did a good job of that themselves out here. The other oft-discussed idea is that AI could replace the App Store, and everyone will just make their own apps instead of buying them from developers. Michael has been blogging about vibe-coding his own to-do list app based on Clear. I’ve been wanting to try this myself, making more little tools of my own to solve niche problems, but the opportunities have been slow to materialize.
    • This week the right idea presented itself and I made a web app using Gemini: an album cover collage maker that searches for the artwork or lets you upload your own. I’ve looked online for something like this before but only found a few that were quite lacking. Making one to my own specifications took maybe five minutes of prompting and testing. Then I thought it would be nice if you could drag the images to different locations. Gemini added that feature like it was nothing. I’m pretty hyped that even someone like me with zero current coding knowledge could will this into existence. If you’d like to try it, I’ve deployed it at usecollagen.netlify.app.
    • Otherwise it was a sort of decompression week where I just read a lot, listened to the records I bought/ordered last week, and was regrettably glued to my phone watching day trading losses (Chekhov’s gun has fired!) and social media feeds.
    • It took a couple weeks of dawdling but I finished John Le Carré’s Call for the Dead, his first novel featuring the spy George Smiley. I may continue reading the series, seeing as his son Nick Harkaway (whose work I really enjoy) has decided to continue his father’s legacy and written one more already: Karla’s Choice. This one was a little dated and not particularly thrilling, but a fine introduction and scene setter.
    • It was immediately followed by Adrian Tchaikovsky’s The Expert System’s Champion, sequel to The Expert System’s Brother which I read at the end of last year. I recommend both as examples of sci-fi stories set so far in the future that humanity has looped back around to the beginning. It reminds me of the “middle chapter” in Cloud Atlas, if you remember that.
    • Then I read Hu Anyan’s I Deliver Parcels in Beijing, a modern memoir that reportedly did well in China when it came out in 2022. It details the author’s dual career as a writer and on-and-off gig economy worker, which is made more interesting by also being a portrait of what it’s like to live in the lower brackets of Chinese society today.
    • I also had time to tackle Rob’s recommendation of Marlen Haushofer’s The Wall, which was written in the 1960s but doesn’t feel that way, unlike Le Carré’s spy novels. He called it the best book he read last year, so I could hardly say no. It starts off like an intriguing sci-fi novel: a woman visiting friends in the Austrian alps wakes up one morning in the log cabin to discover she’s alone, and there’s an invisible wall separating her from the outside world. Things then focus on survival and what it means to live and be human in solitude, and in nature. Which, given that I’ll be home alone next week while Kim is away again for work, means I’m already in the appropriate headspace.
    Some of the better books I’ve read this year
  • Week 43.25


    Vertigo (1958) is a great film, because Hitchcock was a master. It’s also the title of a mediocre stadium rock song, because I love hating on U2.

    Unfortunately, vertigo is also something I experienced for the first time this week — I’m fairly sure I jinxed myself at some point earlier this year by saying out loud “I don’t have any problems”. It hit me on Friday night in the form of extreme dizziness and nausea, and even the walk to bed to sleep it off was difficult without support. It got better the next morning with the help of something called the Epley Maneuver, which I found online.

    Asking around, I discovered that this is a more common human experience than you’d think, with several people I know having suffered episodes. Some of them had dizziness lasting days, and yet it’s strangely not discussed like, all the time? From what I can tell, it’s probably something called benign paroxysmal positional vertigo (BPPV), where calcium deposits in your inner ear become dislodged and move around, screwing with your balance. I’ll be seeing a doctor next week to confirm it, but in any case there’s no known cure and it might keep happening for the rest of my life. It’s crazy that so many are just quietly living with this.

Have I been self-diagnosing with the help of AI? Maybe? I did just sign up for Claude Pro after all, which I’ve mentioned finding more agreeable than ChatGPT. I vibe-coded two little apps before being laid low: a primitive prototype of my long-gestating stealth game, Cat Creeper, and a tool for my book club to figure out how many chapters of any given book we should read in the coming week. That one is called Book Splitter, and I offer it here for any book clubs out there with a similar need to figure out stopping points conveniently near chapter breaks.
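A book-splitting tool like this reduces to a small algorithm: compute cumulative page counts per chapter, then for each week pick the chapter boundary closest to an even share of the total. A sketch of that logic (my own reconstruction, not Book Splitter’s actual code):

```javascript
// Sketch of chapter-boundary splitting (hypothetical reconstruction).
// Given per-chapter page counts and a number of weeks, return 1-indexed
// "stop after chapter N" points that keep weekly reading roughly equal.

function splitBook(chapterPages, weeks) {
  const total = chapterPages.reduce((a, b) => a + b, 0);
  let sum = 0;
  // Pages read by the end of each chapter.
  const cumulative = chapterPages.map((p) => (sum += p));

  const stops = [];
  let prev = -1; // index of the chapter we last stopped after
  for (let w = 1; w < weeks; w++) {
    const target = (total * w) / weeks; // ideal page count for this stop
    let best = prev + 1;
    for (let i = prev + 1; i < cumulative.length; i++) {
      if (Math.abs(cumulative[i] - target) < Math.abs(cumulative[best] - target)) {
        best = i;
      }
    }
    stops.push(best + 1); // report as a 1-indexed chapter number
    prev = best;
  }
  return stops;
}
```

So a six-chapter book of equal-length chapters split over three weeks stops after chapters 2 and 4, while uneven chapters snap to whichever boundary lands nearest the even-page target.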

It’s been a unique experience using Claude’s impressive capabilities while reading Asimov’s I, Robot, which foresaw much of our modern discourse around AI safety, alongside The Optimist, which has finally begun to chronicle some of Sam Altman’s questionable and unethical moves both at OpenAI and in his private life. The sections detailing his gaslighty, ungenerous, and cruel interactions with his sister Annie ironically reminded me of reading about Steve Jobs’s treatment of his daughter Lisa in her memoir Small Fry.

    I just passed the part where Dario Amodei and other employees left to start Anthropic. Just as I try to avoid Meta and Google products because of their comparatively weaker stance on privacy versus Apple, it makes sense that some prefer Anthropic over OpenAI for a more cautious approach to AI.

    ===

    My mother-in-law stayed with us this week, which meant getting the newspaper in for her because that’s how some people still get their news. I was shocked to see how thin the physical Straits Times is these days — almost completely devoid of advertising, and on the whole maybe having 20% of the heft I remember from the 90s. It’s also S$1.10 now, up from the 50 to 80 cents I thought it was. Still, it was kinda nice (nostalgic) to sit at the dining table and read the paper in the morning.

It was also the week where my favorite retro-game-hunting IRL streamer, 4amlaundry, went on a 5-day trip to Kansai, checking out thrift stores and exploring Osaka and Nara. I didn’t want to miss watching it live, so I tried explaining the whole concept of streamers to said mother-in-law, and got her to watch him with me for a while on the TV, the whole time silently praying that he wouldn’t go look at the display cases of half-naked anime figurines that he sometimes checks out in those stores.

    Thankfully, that didn’t happen, and instead we watched him walk down countryside roads, eat at chain restaurants, and get knocked down by the aggressive deer in Nara. All of that made for some good conversation, so if you get the chance to introduce an elder to Twitch, it’s not the worst idea if you can avoid the NSFW aspects.

    Speaking of shows that you would hope won’t be awkward to watch with your parents or in-laws BUT ACTUALLY ARE, add the latest season of The Diplomat to the list. There’s a lot of cursing (I kinda expected that), and some sex scenes that maybe the producers thought were hot and their audience wanted, but are so unnecessary and desperate that they come across as unintentional comedy. Apart from that, it’s still a fun series that leans into unrealistic political drama, with some unexpectedly good writing (for a Netflix show). Just watch it on your own.

    ===

    I somehow forgot to mention the slate of new Apple products announced last week: M5-powered iPad Pros, a 14” basic MacBook Pro, and a spec-bumped Apple Vision Pro. The product lineup is designed to lead you to the conclusion that you should buy everything, because how do you choose between an 11” and 13” iPad Pro, and a 14” MacBook Pro?

    The 11” iPad size is portable for couch use, but the 13” becomes an advanced desk computer for creative work when paired with the Magic Keyboard and Pencil Pro. But if you’re going to be using it while deskbound, why not get a MacBook Pro with 24 hours of battery life (versus just 10 on the iPad), and the possibility of running local AI models and all kinds of other software that isn’t allowed on iPad?

    Making things harder is the fact that a 13” iPad Pro with accessories costs more than an “equivalent” 14” MacBook Pro, and they’re too costly for an average user like me to justify buying both. So the final decision was to hold out a little longer with my current M1-generation gear, and see what upgrades the 13” iPad Air gets next year — hopefully an M4 or M5 processor, ProMotion, and the aluminum Magic Keyboard currently exclusive to iPad Pro models.

But bringing the M5 to the Apple Vision Pro makes it a better system to use and own for the next two years, while we wait for the next big leap forward in miniaturization. However, as a casual user who only clocks a few hours a week, I couldn’t see myself upgrading for a faster chip alone. The more compelling improvement is a new “Dual Knit Band” that comes as standard, which sorta combines the previous Solo Knit Band and Dual Loop Band into one much-improved design.

    The best part is that this new band is also available as a standalone accessory, so I ordered one immediately for my first-gen AVP. It’s simply a marvel of engineering and feels incredibly premium. The build quality is off the charts, and the Fit Dial they’ve created to independently adjust both the back and top straps might be the most Apple-y thing they’ve shipped on an accessory since the Stainless Link Bracelet for the original Apple Watch.

    Thanks to this more comfortable and ergonomic band, I’d planned to spend more time with the AVP this week, until the vertigo and unusual weekly routine got in the way. Not gonna lie, my first thought during the vertigo attack, after “What if this never goes away and I’m disabled for life?” was “Does this mean I can’t use the Vision Pro anymore?”