Chris Padilla/Blog


My passion project! Posts spanning music, art, software, books, and more. Equal parts journal, sketchbook, mixtape, dev diary, and commonplace book.


    Customizing Field State in React Final Form

    React Final Form is a delightful library. So much comes out of the box, and it's extensible enough to handle custom solutions.

    I've been working with it a fair amount recently. Here's what adding custom logic looks like:

    Scenario

    Say that you have data that needs to update two fields within your form. An example might be that you have an event form with the fields:

    • Name: Skate-a-thon 2022
    • Start Date: August 3rd
    • End Date: August 5th

    Let's also say that Skate-a-thon got rescheduled to another weekend.

    When pushing back the start date, we want our form logic to automatically update the end date as well, shifting it by the same number of days.

    Simple Setup in Final Form

    I'll leave the initial setup to the React Final Form docs.

    Let's pick up with a component that looks like this within our form:

    <h2>Render Function as Children</h2>
    <Field name="name">
      {({ input, meta }) => (
        <div>
          <label>Name</label>
          <input type="text" {...input} placeholder="Name" />
          {meta.touched && meta.error && <span>{meta.error}</span>}
        </div>
      )}
    </Field>
    
    <Field name="startDate">
      {({ input, meta }) => (
        <div>
          <label>Start Date</label>
          <input type="date" {...input} placeholder="Start Date" />
          {meta.touched && meta.error && <span>{meta.error}</span>}
        </div>
      )}
    </Field>
    
    <Field name="endDate">
      {({ input, meta }) => (
        <div>
          <label>End Date</label>
          <input type="date" {...input} placeholder="End Date" />
          {meta.touched && meta.error && <span>{meta.error}</span>}
        </div>
      )}
    </Field>

    We're assuming we already have our <Form> wrapper and submit button elsewhere.

    So far with this code, all of our fields will update independently.

    To tie together our endDate field with the startDate, we'll need a custom onChange method and a way to access the form API ourselves.

    Form Hook

    Two hooks come in handy here:

    • useForm(): Gives us access to utility methods for our form. It can be called in any component within the <Form> wrapper.
    • useFormState(): As the name implies, gives us access to the current form state, including values and metadata such as which fields have been "touched".

    We'll open up our component with these hooks:

    import React from 'react';
    import {Field, useForm, useFormState} from 'react-final-form';
    
    const FormComponent = () => {
      const formApi = useForm();
      const {values} = useFormState();
    
      return(
        ...
      )
    };

    And then use these in a custom onChange handler on our field:

    <Field name="startDate">
      {({ input, meta }) => (
        <div>
          <label>Start Date</label>
          <input
            type="date"
            {...input}
            placeholder="Start Date"
            // Custom onChange below
            onChange={(e) => {
              const newStartDate = e.currentTarget.valueAsDate;
    
              // Lets assume I've written a COOL function that takes in the
              // Initial values for startDate and endDate, and calculates a
              // new endDate based on that
              const newValue = calculateNewDateBasedOnNewStartDate(newStartDate, values);
    
              // Update both values through the formApi
              formApi.change('startDate', newStartDate)
              formApi.change('endDate', newValue)
            }}
          />
          {meta.touched && meta.error && <span>{meta.error}</span>}
        </div>
      )}
    </Field>

    There you go! The endDate will update alongside the start date from here!
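    For the curious, here's a hedged sketch of what that hypothetical calculateNewDateBasedOnNewStartDate helper could look like. It simply preserves the duration between the original start and end dates:

```javascript
// A hypothetical sketch of the COOL function referenced above.
// It preserves the duration between the original start and end
// dates, shifting the end date by however much the start moved.
const calculateNewDateBasedOnNewStartDate = (newStartDate, values) => {
  const oldStart = new Date(values.startDate);
  const oldEnd = new Date(values.endDate);

  // Length of the event in milliseconds
  const duration = oldEnd.getTime() - oldStart.getTime();

  // Shift the end date by the same amount the start date moved
  return new Date(newStartDate.getTime() + duration);
};
```

    With Skate-a-thon's two-day span, moving the start from August 3rd to August 10th lands the end date on August 12th.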


    Keeping it 200 — HTTP Status Codes

    I was getting a 503 waiting for my Gmail to load, so I decided to learn more about status codes while I waited!

    Response Classes

    1. 100-199: Informational Responses
    2. 200-299: Successful Responses
    3. 300-399: Redirection Messages
    4. 400-499: Client Error Responses
    5. 500-599: Server Error Responses
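    Those ranges make a response's class easy to compute from the leading digit. A small illustrative helper (my own sketch, not from any library):

```javascript
// Map a status code to its response class by its leading digit.
const statusClass = (code) => {
  const classes = {
    1: 'Informational',
    2: 'Successful',
    3: 'Redirection',
    4: 'Client Error',
    5: 'Server Error',
  };

  return classes[Math.floor(code / 100)];
};

statusClass(503); // 'Server Error'
```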

    Noteworthy Codes

    • 418: I'm a teapot Short and stout

    • 302: Found For temporary changes in URI, as opposed to the more permanent 301.

    • 300: Multiple Choices Where there's more than one possible redirect. The best way to handle this is to send an HTML page with possible choices for handling the response. But then, at that point, wouldn't you just send a 200 status code with a landing page? I could see this being more useful with APIs than webpages.

    • 202: Accepted The request is received but not acted upon. Interestingly, HTTP doesn't have a built-in way to tie a later, asynchronous result back to the original request. I'm thinking of AWS batch uploading here - where the API accepts the data, but then needs however long to actually process the request.

    • 502: Bad Gateway Technically the appropriate response when a server, acting as a gateway or proxy, receives an invalid response from the upstream server it's waiting on.

    • 404: Not Found Had to finish with my favorite.


    Cleaning Local Branches

    It's not spring, but it is time for cleaning! Here's a pretty simple way to clear up local git branches.

    $ git branch -d $(git branch | grep -v "develop\|master")

    Breaking It Down

    $ git branch -d <branch>

    This deletes the branch locally.

    If you wanted to delete remotely, this command does the trick (be mindful you're not deleting teammates' branches, though):

    $ git push origin --delete <branch>

    To fill in multiple arguments, there are a few pieces of glue needed:

    • We'll use the $() command substitution syntax in the shell to fill in our list of branches
    • Within it, we'll pipe all the branch names from git branch
    • grep filters that list with our regex: "develop\|master" matches the develop and master branches
    • The -v flag inverts the match, returning every branch that isn't develop or master

    Testing the Regex

    To make sure the right branches are being grabbed, removing the outer delete command will return the list of branches:

    $ git branch | grep -v "develop\|master"

    Pruning Deleted Remote Branches

    Removed remote branches can be cleared from your local computer with this command:

    git fetch -p

    DIY Analytics & CORS

    I've been exploring analytics options. I have a use case for them, but we're more concerned with specific user behavior on this project. We want to know if they click a certain button, or make it to a certain page.

    There are some options. Google Analytics provides journeys and goals, though it's heavy handed for our use case. Other solutions like Fathom would keep track of individual page performance, but there are certain UI interactions that we're interested in.

    So the need arose! I wrote a custom solution for our app.

    Overview & Stack

    We're using React on the client side. Since button interactions are our main metric, we essentially need something that can be integrated with our click handlers.

    Easy enough! We can fire off a POST request to an external API that records the interaction.

    For the API, I opted to spin up a Next.js API. A single serverless function may have been more appropriate, but I was short on time and know that I can deploy quickly with Next and Vercel.

    For storing data, I created a new database with MongoDB Atlas. Similarly here, this may be more than what we really need, but familiarity won out!

    Client Side

    In my React app, I'm adding this utility function that fires whenever I want to record an action:

    export const recordInteraction = (type) => {
      const data = { type };
    
      fetch('https://analytics-api.vercel.app/api/mycoolapihandler', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(data),
      })
        .then((response) => response.json())
        .then((data) => {
          console.log('Success:', data);
        })
        .catch((error) => {
          console.error('Error:', error);
        });
    };

    type here is simply a string for what event is recorded. It could be "Submits form", "turns on dark mode", or any unique way of identifying an action!

    The rest is your run of the mill fetch request handling.

    API handler

    On the other side of our endpoint is this handler:

    import { connectToDatabase } from '../../lib/mongodb';
    
    export default async function handler(req, res) {
      await runMiddleware(req, res, cors);

      if (req.method === 'POST') {
        const { db } = await connectToDatabase();

        const myObj = {
          date: new Date(),
          type: req.body.type,
        };

        // With no callback passed, insertOne returns a promise we can await.
        // The cached connection from connectToDatabase stays open for reuse.
        await db.collection('acnm').insertOne(myObj);

        res.status(200).json({ message: 'Data recorded' });
      } else {
        res.status(200).json({ name: 'Ready to record' });
      }
    }

    First, you'll notice runMiddleware(). I'll explain that in a second!

    If we receive a POST request, we'll do the following:

    • Connect to the DB (largely borrowing from Next.js's great example directory for setup)
    • Create myObj, a record of the current time of request and the type of request
    • Insert it into the database
    • Return a success message

    CORS Middleware

    There are a few steps I'm skipping - validation, schema creation, sanitization. But this API is only going to interact with my own application, so I'm not concerned about writing a very extensive request handler.

    The way I'm keeping it locked down, out of harm's way from the world wide web, is through CORS.

    An incredibly thorough look at CORS is available at MDN. For our purposes, we just need to know that this is how our API will whitelist only our application's URL when receiving requests.

    Back to the runMiddleware() method! Here is the function declared in the same document:

    import Cors from 'cors';
    
    // Initializing the cors middleware
    const cors = Cors({
      methods: ['POST', 'GET', 'HEAD', 'OPTIONS'],
      origin: ['https://mycoolapp.netlify.app', 'https://mycoolapp.com'],
    });
    
    function runMiddleware(req, res, fn) {
      return new Promise((resolve, reject) => {
        fn(req, res, (result) => {
          if (result instanceof Error) {
            return reject(result);
          }
    
          return resolve(result);
        });
      });
    }

    The cors npm package is a great way of managing CORS without getting into manipulating the headers directly. In our instantiation, we're passing a few options for approved methods and the origins we want to whitelist. (NO trailing slash, FYI!)

    runMiddleware() is the simple wrapper function that handles us using the cors middleware with our request.

    Using the Data

    The nice thing about using Mongodb, or any full blown DB with a sophisticated querying language, is the ability to make use of your data! Our model only has a few simple terms:

    {
        _id,
        date: Date,
        type: String,
    }

    But, that's plenty for us to be able to answer questions such as "How many people submitted a form in the last month?" A perfect solution for our case.
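    As an illustration, here's that question answered in plain JS over an array of made-up records shaped like the model above. (In production this would be a query against the database itself rather than an in-memory filter.)

```javascript
// Count "Submits form" events since a given date, assuming
// records shaped like the model above have been fetched into
// an array. The event name and sample data are just examples.
const submissionsSince = (records, since) =>
  records.filter((r) => r.type === 'Submits form' && r.date >= since)
    .length;

// Made-up sample data
const records = [
  { date: new Date('2022-07-01'), type: 'Submits form' },
  { date: new Date('2022-07-15'), type: 'turns on dark mode' },
  { date: new Date('2022-07-20'), type: 'Submits form' },
];

submissionsSince(records, new Date('2022-07-10')); // 1
```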


    Storing Keyword Arguments in Python Class Instantiation

    Here's the sitch: We recently had an all hands project in Python where we needed to create a pretty flexible internal package.

    Part of that flexibility means not always knowing all the arguments we're instantiating with, but knowing that if certain arguments are available, we want to use them in our methods.

    Here's what I mean:

    class PizzaOven(object):
        def __init__(self, **kwargs):
            # Take in toppings and store for later.
            pass

        def bake(self, number):
            for x in range(number):
                if self.pineapple and self.ham:
                    print("Here's one Hawaiian Pizza")
                else:
                    print("Here's your tasty pizza")

    We have a class that will receive toppings as keyword arguments, and our method later on may need to extract those arguments to know if it's a Hawaiian Pizza. In another case, if we're dealing with a third party api, we may need to plug those values into a method in a certain order, or only use them if they're provided, like an options dictionary.

    Static Dict Property

    When constructing classes, Python stores an instance's writable attributes in the __dict__ attribute.

    Doing this:

    class PizzaOven(object):
        def __init__(self, **kwargs):
            self.ham = kwargs["ham"]

    Is somewhat like doing this:

    class PizzaOven(object):
        def __init__(self, **kwargs):
            self.__dict__["ham"] = kwargs["ham"]

    You wouldn't write to __dict__ directly like this, since that's simply not how we do things around here! But we'll cover how this concept helps out with keyword arguments below.

    Updating Dict with Keyword Arguments

    By using the update dictionary method, we can take in an object that instantiates the class with our toppings and store them as instance variables.

    class PizzaOven(object):
        def __init__(self, **kwargs):
            self.__dict__.update(kwargs)

    Then our toppings are made available later:

    class PizzaOven(object):
        def __init__(self, **kwargs):
            self.__dict__.update(kwargs)

        def bake(self, number):
            for x in range(number):
                if self.pineapple and self.ham:
                    print("Here's one Hawaiian Pizza")
                else:
                    print("Here's your tasty pizza")

    dominos = PizzaOven(pineapple=True, ham=True)

    dominos.bake(1) # "Here's one Hawaiian Pizza"

    Bon Appétit!


    Adding Background Music to Websites

    I'm working on a project where I'm adding my own music for each page in a React app.

    It has me nostalgic for the early internet. You would be cruising around, and all of a sudden, someone's Live Journal would have a charming MIDI of Enya's "Only Time" playing in the background.

    I'm definitely glad this isn't the norm anymore. I don't miss having to mute the obnoxious banner ads that included sound on Ask Jeaves. But now that the web is largely without sound as a background to pages, I've really enjoyed how it's bringing parts of this application to life!

    Quick Overview

    The project isn't out yet, so for the heck of it, let's call my React app "Music Box."

    The music is hosted on Music Box's CDN (Sanity, in my case). The ideal format would be webm, as it works across all modern browsers and is highly performant. For my use case, mp3s suited me just fine.

    In my codebase, I have a big ol' object that stores the track URLs and which page IDs they should play on. It looks something like this:

    const sounds = [
      {
        name: 'Bake Shop',
        src: 'https://cdn.sanity.io/files/qvonp967/production/f4163ffd79e09fdc32d028a1722ef8949fb31b85.mp3',
        conversationIDs: [
          '27f4be58-38f3-4321-bbc9-c76e0c675c36',
          'd008519f-16c0-4ef0-b790-f5eb0cb3b0b4',
        ],
        howl: null,
      },
      {
        name: 'Restaurant',
        src: 'https://cdn.sanity.io/files/qvonp967/production/4606e7ec6208df214d766776e3d5ed33408fe74d.mp3',
        conversationIDs: [
          'e1688c5f-218a-4656-ad96-df9a1c33b8f8',
          'a81fb6a7-d450-45e8-a942-e5c82fb1a812',
        ],
        howl: null,
      },
      ...
    ];

    You'll notice each object also has a howl property. Let's get into how I'm playing sound:

    Playing Audio with Howler.js

    Howler.js is a delightfully feature-full API for handling sound with JavaScript. The library is built on top of the Web Audio API and also uses HTML5 audio for certain use cases. While I could have interfaced with the Web Audio API directly, Howler has much nicer controls for using multiple sounds, interrupting them, and keeping separate sound instances contained in a single sound palette.

    For each page, we initiate the appropriate sound with this code:

      const initiateSound = (src) => {
        const sound = new Howl({
          src,
          loop: true,
        });
    
        return sound;
      };

    src here is derived from the url. The loop option is turned on so that we get continuous music.

    Changing Audio Page to Page

    This is all kept in a SoundController component at the top level of the React tree, above React Router.

    function App() {
    
        ...
        return (
        <>
          <SoundController />
          <Switch location={location} key={location.pathname}>
              <Route
                path="/testimony/:id"
                render={(props) => <Testimony match={props.match} />}
              ></Route>
              <Route path="/act-one">
                <ActOneTestimonySelect />
              </Route>
              ...
          </Switch>
        </>
        )
    };

    The main reason for this is so we have control over fading in and out between pages.
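    As a sketch of the kind of fading this enables: Howler exposes a fade(from, to, duration) method on each sound instance. The crossFade helper, volume levels, and timing below are illustrative examples, not the project's actual code.

```javascript
// Fade the old track out, stop it, and fade the new track in.
// Assumes both arguments are Howl instances (or null for oldHowl).
const crossFade = (oldHowl, newHowl) => {
  if (oldHowl) {
    oldHowl.fade(1, 0, 500); // fade out over half a second
    setTimeout(() => oldHowl.stop(), 500);
  }

  newHowl.play();
  newHowl.fade(0, 1, 500); // fade the new track in
};
```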

    The other reason is for caching. Remember the howl properties in the sound array? That array is going to be stored in a useRef() call in the SoundController component. Then we can save each instantiated sound with the appropriate element in the array for future reference.

    That's exactly what is happening here inside the useEffect. This code listens for a change in the currentTrackObj (triggered by a page change) and checks whether we have a cached Howler instance. The cached version is reused if so, and a new one is created if not.

      useEffect(() => {
        if (currentTrackObj) {
          let howler;
    
          if(currentTrackObj.howl) {
            howler = currentTrackObj.howl;
          } else {
            howler = initiateSound(currentTrackObj.src);
          }
    
          currentTrackObj.howl = howler;
          howlerRef.current = currentTrackObj.howl;
          if (soundPlaying) {
            howlerRef.current.play();
          }
        }
    
        return () => {
          if (howlerRef.current && howlerRef.current.stop) {
            howlerRef.current.stop();
          }
        };
      }, [currentTrackObj]);

    Playing and Pausing

    The state for this is stored in redux as soundPlaying. When that's toggled, we can interface with howler to play and pause the track.

      useEffect(() => {
        if (!playedAudio) {
          dispatch(setPlayedAudio(true));
        }

        if (howlerRef.current) {
          if (soundPlaying && !howlerRef.current.playing()) {
            howlerRef.current.play();
          } else if (!soundPlaying) {
            howlerRef.current.pause();
          }
        }
      }, [soundPlaying]);

    Then that's it! Musical bliss on every page!


    Error Tracking

    Solo deving a fairly large app has humbled me. Errors and bugs can sneak in. Even professional software ships with errors. I've had to accept that it's part of the feedback process of crafting software.

    A lot of the development process for me in this project has been as follows:

    1. Ship new feature
    2. Collaborator tests it and finds a bug
    3. Collaborator tells me something is broken, but with no code-related context they can share
    4. The hunt ensues

    Not ideal! I've been looking for ways to streamline bug squashing by logging pertinent information when errors occur.

    Swallowing pride and accepting a path towards sanity, I've integrated an APM with my app. Here are the details on it:

    Using Sentry

    I opted for Sentry. I did some research on Log Rocket and Exceptionless as well. All of them are fine pieces of software!

    Log Rocket includes a live action replay. Exceptionless provides real time monitoring. And Sentry is focused on capturing errors at the code level.

    For my needs, Sentry seemed to target exactly what I was experiencing — purely code-level issues.

    Integration

    Integrating is as simple as a few npm packages and adding this code to the index.js:

    import React from "react";
    import ReactDOM from "react-dom";
    import * as Sentry from "@sentry/react";
    import { Integrations } from "@sentry/tracing";
    import App from "./App";
    
    //Add these lines
    Sentry.init({
      dsn: "Your DSN here", //paste copied DSN value here
      integrations: [new Integrations.BrowserTracing()],
    
      tracesSampleRate: 1.0, //lower the value in production
    });
    
    ReactDOM.render(<App />, document.getElementById("root"));

    One tweak I had to make was to set Sentry to only run in production. (I'm fairly certain I'll see the errors in development, thank you!)

    if (process?.env.NODE_ENV === 'production') {
      Sentry.init({
        dsn: "Your DSN here", //paste copied DSN value here
        integrations: [new Integrations.BrowserTracing()],
    
        tracesSampleRate: 1.0,
      });
    }

    With that bit of code, Sentry then comes with the stack trace, the OS and browser environment, potential git commits that caused the issue, and passed arguments. All sorts of goodies to help find the culprit!


    Analytics - Accuracy and Ethics

    I don't personally use analytics on this site. I'm not here to growth hack my occasional writing for ad space. But I am involved in a couple of projects where analytics is good feedback for what we're putting out. So I did a little bit of a deep dive.

    Accuracy is Suspect

    Uncle Dave Rupert and Jim Nielsen have striking comparisons between their different analytics services. The gist is that they are serving up WILDLY different data, telling different stories.

    It's not just that Netlify numbers are generally higher than Google Analytics, either. If you follow one service, the data could tell you that you had fewer visits this month, while the other claims you had more.

    Part of this is because of the difference between how the data is gathered.

    Server Side Analytics measures requests. Client Side loads a script on page load.

    There are pros and cons to both. Client-side analytics can better map sources of leads and measure interactivity, but undercounts when JS is turned off or plugins block the script. Server-side is prone to inflated numbers due to bot traffic.

    So it seems like the best solution is to have multiple sources of information. Of course that extends to having more metrics than purely quantitative, as well.

    Privacy and Ethics

    Tangentially, there are some ethics around choosing how to track analytics and who to trust with this.

    It's an interesting space at the moment. Chris Coyier of CSS Tricks has written some thoughts on it. I feel largely aligned. The gist is: aggregate, anonymous analytics is largely OK and needed in several use cases. Personally identifiable analytics are a no-no.

    But I understand that even this “anonymous” tracking is what is being questioned here. For example, just because what I send is anonymous, it doesn’t mean that attempts can’t be made to try to figure out exactly who is doing what by whoever has that data.

    This is key for me. History has told us that if we're not paying for a service, we are likely the product. And so, any analytics service that doesn't have a price tag on it to me is a bit suspect.

    I can't say I have any final conclusions on the matter. Nor am I saying that X is right and Y is wrong; I have no shade to throw. But as I step more and more into positions where I'm a decision maker when it comes to privacy, I'm working to be more and more informed, putting users' best interests at the center.


    Git Hygiene

    My recent projects have involved a fair amount of disposable code. I'll write a component for an A/B test, and then it needs to be ripped out after the experiment closes.

    Git has simplified this process beautifully!

    I could manually handle the files, deleting line by line myself. But git makes it so that I can run a few commands in the CLI to revert everything.

    Here's my workflow for it:

    Modular Commits

    I've been guilty of mega commits that look something like this:

    git commit -m "render revenue data to pie chart AND Connect ID to Dashboard AND move tiers to constants file AND ..."

    I've recently made the switch to breaking out any instance where I would want to put an "and" in my explanation of the change into its own commit. So now my commits will look more like this:

    $ git commit -m "render revenue data to pie chart"
    $ git commit -m "Connect ID to Dashboard"
    $ git commit -m "Move tiers to constants file"

    There are loads of benefits to this. To anyone reviewing my code, it's far easier to follow the story told by my commits. Isolating a breaking change is much easier.

    The best, though, is that it's WAY easier to isolate a commit or few that needs to be thrown out later.

    Revert Commits

    The word comes back from marketing: The first A/B test was a success, but the second needs taking out.

    If a single commit needs changing, it's as easy as this:

    $ git revert 9425e670e9425e66d61c8201...

    git revert will then create a commit with the inverse of those changes.

    Usually, I need to do this with multiple files. The workflow isn't too different:

    $ git revert --no-commit 820154...
    $ git revert --no-commit 425e66...
    $ git revert --no-commit 9425e6...
    $ git commit -m "the commit message for all of them"

    Push and merge from there!


    Fluency

    I'm thinking a lot about this thread by multi-instrumentalist and composer Carlos Eiene

    For me, this is the key phrase:

    Where is the fluency line with an instrument? ... I think a closer answer is having the necessary abilities to effectively communicate in whatever situation you may be in. And if you're in a vacuum, learning an instrument by yourself without ever playing it for or with others... you don't get the chance to communicate musically.

    (Putting aside the whole argument for or against language as an analogy for music here.)

    In Music

    This is such a given in music school. You are jamming with musicians, getting feedback, and performing alongside each other all the time.

    For me, it's been interesting transitioning musical communities.

    The main point of the thread is to deemphasize practicing for the sake of mastery alone. To focus on how you serve musically and how you can still effectively communicate with other musicians.

    I'm thinking a LOT about the inverse, though. How do you find that same community and immersion in a musical context that's a lot more individualist than, say, being in a concert band or jazz combo? Where does the feedback come from there?

    When it comes to writing music, I feel like it's much more in the vein of how I imagine authors write. Or Jazz musicians working on transcriptions, actually. You're not limited by time or space. You are communicating and riffing off of someone's ideas that could be from decades ago. I think a present, accessible community is of course important. But online communities are much more lightweight than when you're in a group that rehearses every week together. And so, filling in the gaps takes working with recordings and materials.

    Speaking as an ambivert, this way of connecting musically is pretty amorphous. The buzzword now is that many relationships online are "parasocial." And don't get me wrong, there's beauty to it, too. I love being able to transcribe a Japanese musician's X68000 chip music so easily and readily, there's an interesting kind of intimacy to that engagement with music. The feedback and communication is strange, though. It's not direct communication, and the community, again, is less tangible.

    Anyhow — sometimes I miss in person music making. Maybe I shouldn't expect writing music to be the same kind of fulfilling. For me, the lesson is that music is multifaceted. Different acts in music can balance each other out. We write to express individualism. We perform to connect with a larger community.

    In Code

    This got me thinking with code languages as well.

    There's a spectrum. Folks who are renaissance devs, those who have dipped their toes in many technologies, are fluent in multiple languages and frameworks, etc. And there are folks who are highly specialized.

    Namely, in web development, is it worth going broad or focusing in?

    (Short answer: go T Shaped)

    The answer comes from community, or maybe more importantly: what problems are your clients grappling with?

    That, too, is a spectrum. If you're aiming for the big companies, Python, data structures, and a CS degree in your back pocket help. If you're doing client work, breadth wins out. If you're an application developer, it may be a more focused set of JS-centric technologies.

    Like music, the field is too large and varied to really say one size fits all.

    No matter what, though, mastery isn't necessarily the goal. Here, it is fluency.

    Some projects may require that intimate knowledge of JS runtime logic.

    Others may only need some familiarity with jQuery.

    The interesting thing about this field, in my mind, is that it's a lot less about working towards a specific target for fluency, but using the tools you have to solve a problem for your collaborators.

    Learning is a natural part of that process. So there is both a really tight feedback loop and there's natural growth and development built in.

    (Again, caveat here to say it's not an excuse to slack on developing your skills. But working towards fluency can keep it so that you are working to master relevant skills vs. simply being virtuosic in an irrelevant way.)

    Back to Music

    The difference here is that software solves a direct problem for someone else. It's creativity with a practical outcome. With music, there's more magic. ✨ The outcomes are less clear, the people you serve and communities you entangle with are less defined. The benefits, even, are vague at times.

    Except, y'know, your soul grows in the process. And simply being creative in the world and sharing that creativity can lead to inspiring others to do the same.


    Geeking Out Over Notion

    You guys, I'm just really jazzed about a piece of software over here.

    I think for anyone that codes, there's just a little bit of the person who organizes their sock drawer in us all. Organization and systems are a big part of the job. And so, our project management has that element to it too.

    In that arena, Notion has just been SO pleasant to use.

    Kanban Board

    My primary use for it is the board view. This alone has been huge, and maybe actually, this is more of a blog about why kanban boards are the best.

    Let me set the scene: Jenn and I start our big game development project together. We're excited, energy is high, and we have lots of brain-space mutually for where things are and what our individual tasks are.

    Then the project gets BIG. The list of features is long, a backlog of bugs crops up, and we don't have a unified spot to keep notes on individual features.

    Notion Board with Cards under Analysis, Development, and Awaiting Input

    The beautiful thing about a board is that we can keep track of multiple features, ideas, and bugs. From a glance, we can see what's on deck for developing, researching, and giving feedback.

    What's especially cool about Notion's boards is that you can open the card up into its own document!

    Say that Jenn makes a card called "Add Pizza to Inventory."

    We have a comment function on the card where we can have a conversation about what toppings the pizza should have, when to add the pizza to the inventory, and so on.

    Under that is space for writing on the document. Anything goes here - adding screenshots, keeping a developer todo list, keeping notes from research. So all the details around that feature are kept in one spot.

    What happens often is that we'll talk about an idea, leave it for months, and then have to come back to it. With a comment thread and notes from development, it's that much easier to pick it up and work on when the time comes.

    Guides and Meeting Notes

    Notion is mostly marketed as one of those "everything-buckets", similar to Evernote or Google Drive.

    I'm personally a believer in plain text and just using your file system for note keeping. But, collaboratively, having a hub for all things project related is unbeatable.

    On top of our progress with the board, we used documents for writing meeting notes and keeping track of guides for using Sanity. We both always have the most up to date info, as Notion syncs automatically with any changes either of us makes.


    Automatically Saving Spotify Weekly Playlists

    With friends, I've been talking for ages about how channels of communication have different feelings and expectations around them. Sending a work email feels different from sending an Instagram DM, which feels different from texting, which feels different from sending a snap on Snapchat.

    For me, the same is true for music apps. I have Spotify, Tidal, Bandcamp, and YouTube accounts with different musical tastes and moods. Especially since these apps all have algorithms for recommending music, I like each to be tuned into a certain mood.

    It just feels strange having a Herbie Hancock album recommended next to the new Billie Eilish, even though I would listen to both!

    SO I have my Spotify Discover Weekly playlist fine tuned to curate a great mood for work with mostly instrumental music. BUT I have to manually save the playlist every week, or else it's gone to the ether.

    Naturally, I was looking to automate the process! Having worked mostly in JavaScript and React so far, I saw it as a great chance to explore scripting in Python.

    What It Does

    This light script does just a couple of things.

    Of course, it gets and reads the current Discover Weekly playlist data, creates a new playlist, and adds all the new tracks to that playlist.

    It also implements a custom naming convention. I have sock-drawer-level organization preferences for naming these playlists. I like to name these by the first track name and the date the playlist was created. Names end up being:

    • An Old Smile 04/05/22
    • Mirror Temple 03/28/22
    • Apology 3/21/22

    This includes a little bit of trimming – some track names end up being ridiculously long, sometimes nonsensical. (Looking at you, ⣎⡇ꉺლ༽இ•̛)ྀ◞ ༎ຶ ༽ৣৢ؞ৢ؞ؖ ꉺლ, an actual artist recommendation.) So there's a very simple shortening of the name if needed.
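    As a rough sketch, here's what that naming-and-trimming step might look like. `make_playlist_name` and the 40-character cap are my own illustration, not the script's actual code:

```python
from datetime import date

MAX_NAME_LENGTH = 40  # arbitrary cap for illustration


def make_playlist_name(first_track_name: str, created: date) -> str:
    """Build a playlist name like 'An Old Smile 04/05/22'."""
    stamp = created.strftime("%m/%d/%y")
    # Trim absurdly long (or nonsensical) track names so the
    # name plus date stays within the cap
    trimmed = first_track_name[: MAX_NAME_LENGTH - len(stamp) - 1]
    return f"{trimmed} {stamp}"


print(make_playlist_name("An Old Smile", date(2022, 4, 5)))  # → An Old Smile 04/05/22
```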

    Using Spotipy and the Spotify Web API

    Spotify already has an exposed web API for doing just what I needed – creating playlists and adding tracks. Doing so, like other OAuth applications, requires providing user authentication and scopes.

    To simplify the authentication and communication, I opted for the lovely Spotipy library. Simple and intuitive, the library handles the back and forth of authenticating the application with Spotify's Web API and holding on to all the tokens needed for requests to my user account.

    Creating a Class for Modularity

    Although this could easily be a single script, I couldn't pass on the opportunity to bundle this code up in some way. I could see this project being extended to handle other playlists, like the Year in Review Playlists.

    Maintaining state was a bit cleaner in writing a class as well. Storing the Spotipy instance and several other reusable pieces of state such as the list of tracks kept all the necessary information stored and self contained, ready for use by the class methods.
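    To give a feel for that shape, here's a minimal sketch of such a class. The class and method names are my own placeholders, and `client` stands in for the Spotipy instance (anything with Spotipy-style playlist methods works):

```python
class WeeklyArchiver:
    """Bundles the save-Discover-Weekly steps behind one self-contained class."""

    def __init__(self, client, user_id, source_playlist_id):
        self.client = client  # a spotipy.Spotify instance in practice
        self.user_id = user_id
        self.source_playlist_id = source_playlist_id
        self.track_uris = []  # reusable state shared by the methods below

    def fetch_tracks(self):
        # Read the current Discover Weekly tracks and hold on to their URIs
        items = self.client.playlist_items(self.source_playlist_id)["items"]
        self.track_uris = [item["track"]["uri"] for item in items]
        return self.track_uris

    def archive(self, name):
        # Create the new playlist and copy the saved tracks into it
        new_playlist = self.client.user_playlist_create(
            self.user_id, name, public=False
        )
        self.client.playlist_add_items(new_playlist["id"], self.track_uris)
        return new_playlist["id"]
```

    In real use, `client` would come from Spotipy's OAuth flow, e.g. `spotipy.Spotify(auth_manager=SpotifyOAuth(scope="playlist-read-private playlist-modify-private"))`.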

    Error Handling in Python

    My first and primary scripting language is JavaScript. Like many other languages, error and exception handling is not necessarily a beginner topic. So it was surprising to me to be accounting for exceptions so early in my Python coding.

    Handle them, I did. Each method is wrapped in a try/except block and logs messages unique to each function, to help keep track of where things go awry.
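    One of those wrapped methods might look something like this (a sketch with illustrative names, not the script's actual code):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("discover-weekly-archiver")


def get_discover_weekly_tracks(client, playlist_id):
    # Wrapped in try/except with a message unique to this step,
    # so the logs point at exactly where things went awry
    try:
        items = client.playlist_items(playlist_id)["items"]
        return [item["track"]["uri"] for item in items]
    except Exception:
        logger.exception("Failed while fetching Discover Weekly tracks")
        return []
```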

    AWS Lambda

    The script wouldn't be much of an improvement if I still had to open up a terminal and run it manually! Uploading to AWS as a Lambda function made sense since it's such a lightweight script that is purely interacting with Spotify's web API.

    I used the Serverless Framework to streamline the process. Initializing the project with their CLI and customizing the config file, I was able to create a Cron Event Handler to fire off the function every Monday at 7:00 AM.
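    The relevant slice of that config might look roughly like this. The service, function, and handler names are my own placeholders; the schedule uses AWS's six-field cron expression syntax:

```yaml
# serverless.yml (sketch)
service: discover-weekly-archiver

provider:
  name: aws
  runtime: python3.9

functions:
  archivePlaylist:
    handler: handler.run
    events:
      # AWS cron fields: minute hour day-of-month month day-of-week year
      # Fires every Monday at 7:00 AM (UTC)
      - schedule: cron(0 7 ? * MON *)
```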

    Playlists Created on Request

    One interesting thing I've noticed about the playlists is that on Mondays, when I open the official Discover Weekly playlist in the desktop app, it will sometimes still show the previous week's playlist, and then later update with the new tracks for the current week.

    I initially thought this would mean that Spotify only updates the playlists after you make a request. If my script ran before that initial access, then it may be saving an old playlist instead of the newly generated one.

    However, in practice, it seems it may actually have more to do with Spotify's app cache taking time to update. When logging the results of pinging the endpoint for the current Discover Weekly tracks, both the first load and a delayed request returned the new tracks appropriately. No need to change my code, but an interesting point to explore.

    Try It Out!

    If you, too, are an exceptional music nerd, you can give my script a whirl yourself! You can find my code at this GitHub repo link, with guidelines for setting up AWS Lambda and Serverless.


    30 on 30

    My first draft of this, I'll be honest, waxed poetic on time and identity. I wrote about my Saturn Return, the transient nature of reality, and how we are, in essence, a part of the universe observing itself.

    BUT THAT'S NO FUN!

    So instead, here's my listicle of 30 lessons learned leading up to this big, hairy landmark.

    30 Lessons

    1. Enthusiasm is the most important compass. I think about this a lot when it comes to planning my own future. I have no idea what will make me happy tomorrow. And that's ok! There will always be something to be excited about and moving towards!

    2. Everything has diminishing returns at some point. Money, networking connections, living in excess, even living a balanced life to a degree. Aim for 80% in most things, that's the sweet spot of effort and reward.

    3. Life happens in seasons. If you're a high achieving type, it's easy to fall into the idea that production should always stay high. But we need those slow periods for reflection and recharging. A cliche at this point. But really feeling this on a month to month, year to year, decade to decade level has been powerful.

    4. Eat well. Seriously.

    5. Sleep. Another boring one, but come on! It's really important! I can hear you now: "What's next, are you going to tell us to exercise?!"

    6. Exercise. Get out of your head and into your body. A good walk is great medicine.

    7. It all works out in the end. It's hard to know this without having a few lived experiences of genuine challenge under your belt. I feel like I'm just getting there. But trust me. It all works out in the end. In every way, this too shall pass.

    8. Success is not a direct result of effort. Don't get me wrong, effort is wildly important. But it's actually effort multiplied by a much, much, much larger variable of luck, and a third variable of resources (eg "talent" or inclination). A big lesson late in my 20s has been to accept this and use it. Work steadily, stay humble, and look out for open opportunities. It's much more enjoyable than the brute force method.

    9. On n'arrive jamais. One never arrives. (Quoting musicians here for you Eugene Rousseau / Marcel Mule fans!) The anticipation is greater (and really lots more fun!) than the realization. Take time to savor the journey.

    10. Keep in touch. Doing this in an intentional, genuine routine is an easy way to get the ol' warm n fuzzies.

    11. Don't take anything too seriously. Seriously.

    12. Back up your files!! I grew up having to reinstall Windows on our home PC every couple of years. It wasn't a big deal when it was just Kid Pix files on there. But now that all of my work is digital, it's a necessity.

    13. Make time for personal creative projects. Even when what you do for work is creative. This has been a lifeline for me. There are so many reasons for it. It's fun, you learn so much by doing it, you discover identity through it. And anything works! Blogging, Twitch streaming, fan fiction. Actually, Vonnegut says it best: ". . . Practice any art, music, singing, dancing, acting, drawing, painting, sculpting, poetry, fiction, essays, reportage, no matter how well or badly, not to get money and fame, but to experience becoming, to find out what's inside you, to make your soul grow."

    14. Attention is the greatest gift to give and receive. Paraphrasing from Simone Weil, as discovered on the blog formerly known as Brain Pickings.

    15. Acceptance as a horizon. Just starting on the path of learning this one. Probably the most important one on the list. Acceptance of others and self is wildly intertwined. Part of growing up is simultaneously being open to the differences in others and yourself. A tricky thing, too big for a listicle!

    16. Beware the differences between your genuine values and societal values. Again, enthusiasm helps here in parsing which is which.

    17. There's greater wisdom in the gut than we give credit for. Some of my better decisions were against reason and in favor of intuition.

    18. Be who you are now. A lesson from teaching music to kids. Pardon the philosophical bent here: A 6th grader's purpose isn't to grow up or to learn all their scales for 7th grade. It's to be a 6th grader. We're all working towards something, but losing sight of who we are now takes away from the unique joys of where we are. The best lessons I taught were ones where we savored enthusiasm. Particularly for beginners, savoring the newness of learning a song they were inspired by. (Sometimes it was Megalovania...actually, most of the time it was Megalovania.) And yeah, then we did some scale work too.

    19. Do something for work, and something else for creativity's sake. I'm here to say it's true, both halves make a greater whole. The nice thing is that the vehicle for money can be inspiring too — coding and music both support each other creatively for me.

    20. Books are great. Go pick one up! Remember how CRAZY BONKERS it is that you and I are connecting minds right now across TIIIIME AND SPAAAACE - through the magic of printed text!

    21. Invest in your tools. When I started at UT, I was simultaneously playfully poked at for playing on awkward mouthpieces, and I was praised for making them work. BUT after buying newer, nicer setups, it was just easier to sound good and more fun to play the dang horn!

    22. You don't need to be a gear-head. Then again, I was learning to code on a $200 chromebook that I had to install linux onto. Build times took ages. But it got me here. 🤷‍♂️

    23. There's so much time. Back to no. 18. Not so much a lesson as much as an observation. The 20s to me felt like a race to Arrive and find stable ground. Once you have it, somewhere between 28 and 36 for most folks I talk to, the world opens up. So savor whichever stage you're in, the striving or the sustaining. Both have their own beauty on the journey.

    24. It's ok to give up on something partway through! Thanks for reading! 👋


    Adding RSS Feed to Next.js with SSR

    I'm a big blog nerd. Growing up, I subscribed to my favorite webcomics. I mourned the death of Google Reader. I love the spirit of blogging today as an alternative, slow paced social media.

    Naturally, I HAD to get one going on this site!

    There are several great resources for getting a feed going with SSG and Next.js. This one was a favorite. Here, I'm going to add my experience setting it up with an SSR Next site.

    The Sitch

    Here's what static site solutions suggested:

    • Write your rssFeedGenerator function
    • Add the function to a static page's getStaticProps method
    • On build, the site will generate the feed and save it in a static XML file.

    The issue for my use case is that my site is leveraging Server Side Rendering. I'm doing this so I can upload a post that is scheduled to release at a later date. With a static site, I would be stuck with old data and the post wouldn't release. With SSR, there is a simple date comparison that filters published and scheduled posts.

    So, since we have a Server Side Rendering solution for pages, we need an SSR solution for the RSS feed.

    Rendering RSS Feed from the Server

    I'll briefly start with the code to generate the XML file for the feed. I'm creating a generateRSSFeed method that largely looks similar to the one described in this guide.

    That gets passed to my handler getRSSFeed.

    export async function getRSSFeed() {
      const posts = await getAllPostsWithConvertedContent(
        [
          'title',
          'date',
          'slug',
          'author',
          'coverImage',
          'excerpt',
          'hidden',
          'content',
        ],
        {
          filter: filterBlogPosts,
          limit: 10,
        }
      );
    
      const feed = generateRSSFeed(posts);
      return feed;
    }

    lib/api.js

    And here's the tweak: I'm using the method in the api routes folder instead of getStaticProps.

    import { getRSSFeed } from '../../lib/api';
    
    export default async function handler(req, res) {
      const xml = await getRSSFeed();
      res.setHeader('Content-Type', 'application/rss+xml');
      res.send(xml);
    }

    pages/api/feed.js

    Instead of generating a static file and saving it to our assets folder, here we're serving it up from the API directly.

    And that's it! Once the time passes on a scheduled post, the next request to the feed will include that latest post!


    Balancing New and Familiar Tech

    After developing this site, I realized that getting started was the hardest part.

    When I set out to build it, I had a clear vision for what I wanted to accomplish. I also had a very ambitious set of tech I wanted to learn along the way.

    Learning It All

    I was inspired with this project to roll my sleeves up and get close to the metal. At work, I design web apps with React, Meteor, Mongo, and several other tools that make life easy. I was hungry to balance it with a real challenge.

    To me, that meant:

    • Writing blog posts in markdown
    • Converting markdown to HTML
    • Handling my own routing by picking Express back up
    • Learning a new templating language
    • Deploying to a higher "professional standard"
    • Handling image hosting
    • Optimizing images
    • AND MORE

    Basically, I wanted to hand code as much as I could without any help!

    Getting Stuck

    This went nowhere fast.

    After getting an Express server up, I was deep in decision fatigue. I was having to make unique choices about so many details. I had to learn as I went with a greater number of libraries and tech. I had very little that felt familiar in front of me.

    And so I was stuck motivationally.

    Pareto's Principle

    If you're unfamiliar, Pareto's Principle is the idea that roughly 80% of consequences come from 20% of causes, and vice versa.

    The principle is popular in business. 80% of revenue comes from 20% of clients.

    I realized while in the weeds that it's a fair ratio for development and learning new tech, too.

    80/20 Rule in Tech

    So, ego got checked at the door. I scaled back the "newness" of what I was doing by picking familiar tech - React, Next.js, hosting on Vercel, AWS.

    I then experienced the sweet spot of balancing new technologies and features while building the site.

    My final balance with this tech looked like this:

    • I was familiar with 80% of what I was working with. (React, Next, Vercel, AWS)
    • I was unfamiliar with 20% of the tech I was working with (Hosting Markdown and Image Optimization)

    I found flow with that ratio. It's when I was trying to work with more of a 50/50 balance that I lost momentum. When I was trying to get back into Express with a new templating language, serving static files, AND all of the above new tech, I stalled.

    Finding the right balance kept me productive, happy, and still learning a great deal along the way.