Adding frontend observability to michaellamb.dev


After working with Grafana at my company, I decided to see how far the free accounts stretch. After all, I’ve got an internet domain and reasons to observe it, so why not use Grafana for observability?

What I get from Grafana Faro

Grafana Faro dashboard for the blog application

This is the current view of the dashboard, showing metrics from the blog application since my last post. Grafana Faro loads on the client side and is automatically blocked by some ad-blockers, but for the folks who visit without one, I get to see how my page performs for a real end user.

Sometimes, users visit a website without an ad-blocker without even realizing it, because they’re inside something like a social media app that opens its own in-app browser to keep them there. I’ve noticed most sessions like this come from my LinkedIn post advertising last week’s update.

What it took to start using Faro

Grafana makes it very easy to integrate Faro into a frontend app. Whether through the web SDK or the CDN bundle, anyone can have observability in minutes. Most of the code below is generated within the Grafana dashboard, but I refactored it to fit my development flow. Observability is segmented by environment, so I can see how my website performs in both dev and prod.

(function () {
  // Check if we're in a local development environment
  const isLocalDev = window.location.hostname === 'localhost' || window.location.hostname === '127.0.0.1';
  
  console.log('isLocalDev', isLocalDev);
  var webSdkScript = document.createElement("script");

  // fetch the latest version of the Web-SDK from the CDN
  webSdkScript.src =
    "https://unpkg.com/@grafana/faro-web-sdk@^1.4.0/dist/bundle/faro-web-sdk.iife.js";

  webSdkScript.onload = () => {
    // Configure Faro differently based on environment
    const faroConfig = {
      url: "https://faro-collector-prod-us-east-0.grafana.net/collect/...",
      app: {
        name: "blog",
        version: "1.0.0",
        environment: isLocalDev ? "development" : "production",
      }
    };
    
    // Add transport configuration for local development to handle CORS
    if (isLocalDev) {
      faroConfig.transport = {
        mode: 'no-cors'
      };
    }
    
    // Initialize Faro with the appropriate configuration
    window.GrafanaFaroWebSdk.initializeFaro(faroConfig);

    // Load instrumentations at the onLoad event of the web-SDK and after the above configuration.
    // This is important because we need to ensure that the Web-SDK has been loaded and initialized before we add further instruments!
    var webTracingScript = document.createElement("script");

    // fetch the latest version of the Web Tracing package from the CDN
    webTracingScript.src =
      "https://unpkg.com/@grafana/faro-web-tracing@^1.4.0/dist/bundle/faro-web-tracing.iife.js";

    // Initialize, configure (if necessary) and add the new instrumentation to the already loaded and configured Web-SDK.
    webTracingScript.onload = () => {
      window.GrafanaFaroWebSdk.faro.instrumentations.add(
        new window.GrafanaFaroWebTracing.TracingInstrumentation()
      );
    };

    // Append the Web Tracing script tag to the HTML page
    document.head.appendChild(webTracingScript);
  };

  // Append the Web-SDK script tag to the HTML page
  document.head.appendChild(webSdkScript);
})();

What else does Faro do

I don’t really know.

And in most ways, that’s because I’m not a frontend developer – but I do know that Faro allows you to further instrument your application for error tracking and custom signals. And yet, I am a frontend developer; I just don’t prefer it, so I work on it begrudgingly. Although I love me some HTML.

In contexts where people care, Faro instrumentation enables things like conversion funnel analysis, feature usage and adoption tracking, A/B testing, and monitoring and alerting on slow API response times.
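As a rough sketch of what that extra instrumentation can look like, here’s how you might push a custom event and a handled error through the Faro API once the Web-SDK from the snippet above has been initialized (the event name, attributes, and riskyOperation function are hypothetical):

// Sketch: custom signals via the Faro API (assumes the Web-SDK above is initialized)
const faro = window.GrafanaFaroWebSdk.faro;

// Record a feature interaction as a custom event
faro.api.pushEvent("newsletter_signup_clicked", {
  placement: "footer", // hypothetical attribute
});

// Report a handled error so it shows up alongside unhandled ones
try {
  riskyOperation(); // hypothetical function
} catch (err) {
  faro.api.pushError(err);
}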

If that’s you and you want something to help you do that, I suggest Grafana Faro.


Authored by Michael Lamb.
Published on 18 June 2025.
Category: Social


The Tale of the Lazy SME


There was a lazy subject matter expert…

He wanted to be helpful, but he didn’t want it to require a lot of effort. He thought, “The machine intelligence apparatus will be useful for sharing my knowledge efficiently.”

And so he spent days and weeks and months to create the ultimate documentation of his work so that he could share it with others at no more expense of his time.

But what he failed to realize was that it was his time that made the work meaningful.

Because the work was for other people, the individual doing the contributing mattered more in the recipe of a matrix organization than the contribution itself.

The machine intelligence was not thinking about other people, not the way the individual does.

And so, as he automatically routed the questions others asked of him to be answered by a machine instead of a person, people learned to stop asking him questions.

Then, all the time he spent developing a machine intelligence so he could be lazy was thusly wasted.

And then he led the Butlerian Jihad. THE END.


elsewhere, on the internet

I updated my Letterboxd profile picture

SINNERS is a religious experience to me

I'm happy to suggest the weight of that impact by this change, as I don't plan to use this picture anywhere else

Clarksdale, MS, is a place to meet God, if you care to listen for her


— michael lamb (@michaellamb.dev) June 2, 2025 at 11:55 PM

Authored by Michael Lamb.
Published on 13 June 2025.


I built a custom Letterboxd diary viewer


I am an avid Letterboxd user and wanted to create a custom viewer for my diary entries. Since college, I’ve known about HTML5 UP, a website that offers free templates with beautiful, responsive designs. I decided to give it a try and see if I could create a viewer with it.

HTML5 UP Template

Here’s what the template looks like. It’s named Multiverse.

Multiverse

Letterboxd Viewer

And here’s how the viewer turned out.

Viewer

Contact form

The template features an About section containing a contact form. For simplicity and ease of use, I opted to publish new submissions to a Discord webhook in a private channel on my server. Submitting the form triggers an AJAX call that formats the message and sends it to the webhook, which then posts it to the channel.
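As a minimal sketch of that flow (the webhook URL and field names here are placeholders, and the actual repo may wire this up differently):

// Sketch: forward a contact form submission to a Discord webhook
async function sendToDiscord(name, message) {
  // Placeholder URL; keep the real webhook out of anything you don't want abused
  const webhookUrl = "https://discord.com/api/webhooks/<id>/<token>";

  // Discord webhooks accept a JSON payload with a "content" field
  await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      content: `New contact form message from ${name}: ${message}`,
    }),
  });
}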

Webhook

Architecture

Architecture

Check out the repo here and feel free to fork and modify it to your needs. The README should have all the information you need to get started if you know how to poke around some code.

Open the Letterboxd Diary Viewer now


Authored by Michael Lamb.
Published on 21 February 2025.
Category: Social


Sherlock Project


Today, I’m writing to feature the Sherlock Project. Searching over 400 social media networks, Sherlock can help you find accounts by username.

Installation

On macOS, I used Homebrew to install the sherlock CLI tool. Check out the installation instructions for more details for your system.
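For reference, getting up and running looked roughly like this on my machine (assuming the Homebrew formula is available for your setup):

# Install the CLI, then search one or more usernames
brew install sherlock
sherlock michaellambgelo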

my results - michaellambgelo

I searched for my own username and found a few results. I’m sure there are more, but I’m not going to list them all.

Here was every account Sherlock found for my username:

https://www.codewars.com/users/michaellambgelo
https://discord.com
https://hub.docker.com/u/michaellambgelo/
https://dribbble.com/michaellambgelo
https://www.duolingo.com/profile/michaellambgelo
https://www.fiverr.com/michaellambgelo
https://freesound.org/people/michaellambgelo/
https://genius.com/michaellambgelo
https://giphy.com/michaellambgelo
https://www.github.com/michaellambgelo
https://gitlab.com/michaellambgelo
https://hackenproof.com/hackers/michaellambgelo
https://hackerone.com/michaellambgelo
https://hackerrank.com/michaellambgelo
https://holopin.io/@michaellambgelo
https://imgur.com/user/michaellambgelo
https://www.producthunt.com/@michaellambgelo
https://pypi.org/user/michaellambgelo
https://www.reddit.com/user/michaellambgelo
https://www.roblox.com/user.aspx?username=michaellambgelo
https://scratch.mit.edu/users/michaellambgelo
https://www.shpock.com/shop/michaellambgelo/items
https://slideshare.net/michaellambgelo
https://www.snapchat.com/add/michaellambgelo
https://soundcloud.com/michaellambgelo
https://open.spotify.com/user/michaellambgelo
https://steamcommunity.com/groups/michaellambgelo
https://steamcommunity.com/id/michaellambgelo/
https://www.strava.com/athletes/michaellambgelo
https://ch.tetr.io/u/michaellambgelo
https://tldrlegal.com/users/michaellambgelo/
https://trello.com/michaellambgelo
https://www.twitch.tv/michaellambgelo
https://xboxgamertag.com/search/michaellambgelo
https://www.youtube.com/@michaellambgelo
https://www.baby.ru/u/michaellambgelo/
Total Websites Username Detected On : 36

Typically, I’ve reserved the moniker michaellambgelo for personal accounts, distinct from the more professional accounts where I’m inclined to discuss my career or promote a company. It is a reminder to myself that there is a time for work and a time for play – and michaellambgelo is meant for play.

Upcoming

  • I am exploring cloud hosting to begin user testing of a new Discord Member Matchmaking app
  • I will be hosting a movie night featuring the film EVANGELION:1.11 YOU ARE (NOT) ALONE. this Sunday at 4 PM Central Standard Time in the C Spire Gaming Discord server

Join the C Spire Gaming Discord server at the widget below

Thank you for reading!


Authored by Michael Lamb.
Published on 28 January 2025.
Tags: feature


My experience with the Intelligence we have given machines


I have not posted a blog update since March when I announced that I had purchased an M3 MacBook Air. My curiosity with local LLMs was relatively short-lived, mostly because my life is already a lot of tinkering and the true value in running a local LLM is to customize it to do what you want — but I don’t always know what I want.

Local LLMs

I played around with a few different LLMs using Ollama, which provides a CLI tool for managing models and serving them locally. Most 7B models ran well on my M3, but higher-parameter models were too slow. Still, the only real benefit was that I could run a local model and use it to generate text; it just wasn’t very good or useful text.
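When Ollama is serving a model, it also exposes a local HTTP API on port 11434, so a minimal sketch of querying it from JavaScript might look like this (the model name is just an example; use whatever you’ve pulled):

// Sketch: request a completion from a locally served Ollama model
async function askLocalModel(prompt) {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",  // example model name
      prompt,
      stream: false,    // return a single JSON object instead of a stream
    }),
  });

  const data = await response.json();
  return data.response; // the generated text
}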

I installed Ollama Autocoder to interact with local models within VS Code. Though this provided a neat autocomplete feature, most of the suggestions were poor.

ChatGPT

My experience with ChatGPT has been mixed at best. It does okay with some generations, but more often than not it is less than helpful with coding problems. I have found some success using ChatGPT to generate the incorrect answers for a trivia game, but mostly it has become a spellcheck and sentiment-analysis tool for my correspondence.

Codeium

When C Spire began exploring AI tools, leadership decided to run a pilot program. Three groups of developers were assigned to the pilot: one was the control group, the others were split between GitHub Copilot and Codeium.

When Codeium received the best marks among developers, the rest of the development teams adopted the tool. I have been in the fortunate position of speaking with developers, both junior and senior, and gleaning from them the value they find in generative AI. Junior developers are more enthusiastic about the tool, as it provides feedback for them to iterate on their problems independently, a goal which always seems to be front of mind for junior devs as they perceive their senior team members’ time as more valuable. Senior developers either use the tool for specific use cases or they don’t use it at all.

In my own development work with C Spire I have used Codeium, and the biggest help to me has been frontend tasks that require more layout work or lots of file touches. Some of our older frontend apps are large and unwieldy, but Codeium has meant fewer headaches for me when making changes in those apps. The newer apps (in frameworks like Angular) are much more pleasant for me to work in, and even better to work in with Codeium.

Windsurf and Cascade

In my personal capacity as a software engineer, I’ve recently adopted Windsurf and Cascade and found a lot of joy in accomplishing things quickly. The motivating reason for trying out Windsurf was to create a custom stream overlay for my Twitch channel. I wanted something that would make my stream look professional and Cascade helped deliver that for me.

Windsurf is the IDE created by Codeium to work with their AI models. Cascade is the agentic AI model built into Windsurf. One important difference I’ve noted is that Cascade does more than simply respond to a prompt — it thinks ahead and plans accordingly.

I’ve employed Cascade to analyze and refactor aspects of this very blog site (most changes are under-the-hood).

What’s next

I have a few ideas for things I’d like to create but haven’t had the time, motivation, or resources to accomplish. With Cascade at my fingertips, I’m eager to build something new.


Authored by Michael Lamb.
Published on 09 January 2025.



About michaellamb.dev

Michael Lamb is a software engineer working at C Spire. If you have a blog-specific inquiry, please create a new issue on GitHub. Feel free to fork this blog and build your own!

Get to know who I am in my first post Hello, World!

© Copyright 2021-2025
Michael Lamb Blog