Chris Padilla/Blog


My passion project! Posts spanning music, art, software, books, and more. Equal parts journal, sketchbook, mixtape, dev diary, and commonplace book.


    Geeking Out Over Notion

    You guys, I'm just really jazzed about a piece of software over here.

    I think for anyone that codes, there's just a little bit of the person who organizes their sock drawer in us all. Organization and systems are a big part of the job. And so our project management tools have that element to them too.

    In that arena, Notion has just been SO pleasant to use.

    Kanban Board

    My primary use for it is the board view. This alone has been huge, and maybe actually, this is more of a blog about why kanban boards are the best.

    Let me set the scene: Jenn and I start our big game development project together. We're excited, energy is high, and we have lots of brain-space mutually for where things are and what our individual tasks are.

    Then the project gets BIG. The list of features is long, a log of bugs crops up, and we don't have a unified spot to keep notes on individual features.

    Notion Board with Cards under Analysis, Development, and Awaiting Input

    The beautiful thing about a board is that we can keep track of multiple features, ideas, and bugs. From a glance, we can see what's on deck for developing, researching, and giving feedback.

    What's especially cool about Notion's boards is that you can open the card up into its own document!

    Say that Jenn makes a card called "Add Pizza to Inventory."

    We have a comment function on the card where we can have a conversation about what toppings the pizza should have, when to add the pizza to the inventory, and so on.

    Under that is space for writing on the document. Anything goes here - adding screenshots, keeping a developer todo list, keeping notes from research. So all the details around that feature are kept in one spot.

    What happens often is that we'll talk about an idea, leave it for months, and then have to come back to it. With a comment thread and notes from development, it's that much easier to pick it up and work on when the time comes.

    Guides and Meeting Notes

    Notion is mostly marketed as one of those "everything-buckets", similar to Evernote or Google Drive.

    I'm personally a believer in plain text and just using your file system for note keeping. But, collaboratively, having a hub for all things project related is unbeatable.

    On top of our progress with the board, we use documents for writing meeting notes and keeping track of guides for using Sanity. We both always have the most up-to-date info, as Notion syncs automatically with any changes either of us makes.


    Automatically Saving Spotify Weekly Playlists

    With friends, I've been talking for ages about how channels of communication have different feelings and expectations around them. Sending a work email feels different from sending an Instagram DM, which feels different from texting, which feels different from sending a snap on Snapchat.

    For me, the same is true for music apps. I have Spotify, Tidal, Bandcamp, and YouTube accounts with different musical tastes and moods. Especially since these apps all have algorithms for recommending music, I like each to be tuned into a certain mood.

    It just feels strange having a Herbie Hancock album recommended next to the new Billie Eilish, even though I would listen to both!

    SO I have my Spotify Discover Weekly playlist fine-tuned to curate a great mood for work with mostly instrumental music. BUT I have to manually save the playlist every week, or else it's gone to the ether.

    Naturally, I was looking to automate the process! Having worked mostly in JavaScript and React so far, I saw it as a great chance to explore scripting in Python.

    What It Does

    This light script does just a couple of things.

    Of course, it gets and reads the current Discover Weekly playlist data, creates a new playlist, and adds all the new tracks to that playlist.

    It also implements a custom naming convention. I have sock-drawer-level organization preferences for naming these playlists. I like to name them by the first track's name and the date the playlist was created. Names end up being:

    • An Old Smile 04/05/22
    • Mirror Temple 03/28/22
    • Apology 3/21/22

    This includes a little bit of trimming – Some track names end up being ridiculously long, sometimes nonsensical. (Looking at you, ⣎⡇ꉺლ༽இ•̛)ྀ◞ ༎ຶ ༽ৣৢ؞ৢ؞ؖ ꉺლ, an actual artist recommendation.) So there's a very simple shortening of the name if needed.

    Using Spotipy and the Spotify Web API

    Spotify already has an exposed web API for doing just what I needed – creating playlists and adding tracks. Doing so, like other OAuth applications, requires providing user authentication and scopes.

    To simplify the authentication and communication, I opted for the lovely Spotipy library. Simple and intuitive, the library handles the back and forth of authenticating the application with Spotify's Web API and holding on to all the tokens needed for requests to my user account.

    Creating a Class for Modularity

    Although this could easily be a single script, I couldn't pass on the opportunity to bundle this code up in some way. I could see this project being extended to handle other playlists, like the Year in Review Playlists.

    Maintaining state was a bit cleaner in writing a class as well. Storing the Spotipy instance and several other reusable pieces of state, such as the list of tracks, kept all the necessary information self-contained and ready for use by the class methods.

    Error Handling in Python

    My first and primary scripting language is JavaScript. As in many other languages, error and exception handling is not necessarily a beginner topic. So it was surprising to me to be accounting for exceptions so early in my Python coding.

    Handle them, I did. Each method is wrapped in a try / except block and logs messages unique to each function, to help keep track of where things go awry.

    AWS Lambda

    The script wouldn't be much of an improvement if I still had to open up a terminal and run it manually! Uploading to AWS as a Lambda function made sense since it's such a lightweight script that is purely interacting with Spotify's web API.

    I used the Serverless Framework to streamline the process. Initializing the project with their CLI and customizing the config file, I was able to create a Cron Event Handler to fire off the function every Monday at 7:00 AM.

    Playlists Created on Request

    One interesting thing I've noticed about the playlists is that on Mondays when I open up the official Discover Weekly playlist in the desktop app, it will sometimes still show the previous week's playlist, and then later update with the new tracks for the current week.

    I initially thought this would mean that Spotify only updates the playlists after you make a request. If my script ran before that initial access, then it may be saving an old playlist instead of the newly generated one.

    However, in practice, it seems it may actually have more to do with Spotify's app cache taking time to update. Logging results from pinging the endpoint for the current Discover Weekly tracks, both an initial request and a delayed one returned the new tracks appropriately. No need to change my code, but an interesting point to explore!

    Try It Out!

    If you, too, are an exceptional music nerd, you can give my script a whirl yourself! You can find my code here at this GitHub repo with guidelines for setting up AWS Lambda and Serverless.


    30 on 30

    My first draft of this, I'll be honest, waxed poetic on time and identity. I wrote about my Saturn Return, the transient nature of reality, and how we are, in essence, a part of the universe observing itself.

    BUT THAT'S NO FUN!

    So instead, here's my listicle of 30 lessons learned leading up to this big, hairy landmark.

    30 Lessons

    1. Enthusiasm is the most important compass. I think about this a lot when it comes to planning my own future. I have no idea what will make me happy tomorrow. And that's ok! There will always be something to be excited about and moving towards!

    2. Everything has diminishing returns at some point. Money, networking connections, living in excess, even living a balanced life to a degree. Aim for 80% in most things, that's the sweet spot of effort and reward.

    3. Life happens in seasons. If you're a high achieving type, it's easy to fall into the idea that production should always stay high. But we need those slow periods for reflection and recharging. A cliche at this point. But really feeling this on a month to month, year to year, decade to decade level has been powerful.

    4. Eat well. Seriously.

    5. Sleep. Another boring one, but come on! It's really important! I can hear you now: "What's next, are you going to tell us to exercise?!"

    6. Exercise. Get out of your head and into your body. A good walk is great medicine.

    7. It all works out in the end. It's hard to know this without having a few lived experiences of genuine challenge under your belt. I feel like I'm just getting there. But trust me. It all works out in the end. In every way, this too shall pass.

    8. Success is not a direct result of effort. Don't get me wrong, effort is wildly important. But it's actually effort multiplied by a much, much, much larger variable of luck, and a third variable of resources (eg "talent" or inclination). A big lesson late in my 20s has been to accept this and use it. Work steadily, stay humble, and look out for open opportunities. It's much more enjoyable than the brute force method.

    9. On n'arrive jamais. One never arrives. (Quoting musicians here for you Eugene Rousseau / Marcel Mule fans!) The anticipation is greater (and really lots more fun!) than the realization. Take your time.

    10. Keep in touch. Doing this in an intentional, genuine routine is an easy way to get the ol' warm n fuzzies.

    11. Don't take anything too seriously. Seriously.

    12. Back up your files!! I grew up having to reinstall Windows on our home PC every couple of years. It wasn't a big deal when it was just Kid Pix files on there. But now that all of my work is digital, it's a necessity.

    13. Make time for personal creative projects. Even when what you do for work is creative. This has been a lifeline for me. There are so many reasons for it. It's fun, you learn so much by doing it, you discover identity through it. And anything works! Blogging, Twitch streaming, fan fiction. Actually, Vonnegut says it best: ". . . Practice any art, music, singing, dancing, acting, drawing, painting, sculpting, poetry, fiction, essays, reportage, no matter how well or badly, not to get money and fame, but to experience becoming, to find out what's inside you, to make your soul grow."

    14. Attention is the greatest gift to give and receive. Paraphrasing from Simone Weil, as discovered on the blog formerly known as Brain Pickings.

    15. Acceptance as a horizon. Just starting on the path of learning this one. Probably the most important one on the list. Acceptance of others and self is wildly intertwined. Part of growing up is simultaneously being open to the differences in others and yourself. A tricky thing, too big for a listicle!

    16. Beware the differences between your genuine values and societal values. Again, enthusiasm helps here in parsing which is which.

    17. There's greater wisdom in the gut than we give credit for. Some of my better decisions were against reason and in favor of intuition.

    18. Be who you are now. A lesson from teaching music to kids. Pardon the philosophical bent here: A 6th grader's purpose isn't to grow up or to learn all their scales for 7th grade. It's to be a 6th grader. We're all working towards something, but losing sight of who we are now takes away from the unique joys of where we are. The best lessons I taught were ones where we savored enthusiasm. Particularly for beginners, savoring the newness of learning a song they were inspired by. (Sometimes it was Megalovania...actually, most of the time it was Megalovania.) And yeah, then we did some scale work too.

    19. Do something for work, and something else for creativity's sake. I'm here to say it's true, both halves make a greater whole. The nice thing is that the vehicle for money can be inspiring too — coding and music both support each other creatively for me.

    20. Books are great. Go pick one up! Remember how CRAZY BONKERS it is that you and I are connecting minds right now across TIIIIME AND SPAAAACE - through the magic of printed text!

    21. Invest in your tools. When I started at UT, I was simultaneously playfully poked at for playing on awkward mouthpieces and praised for making them work. BUT after buying newer, nicer setups, it was just easier to sound good and more fun to play the dang horn!

    22. You don't need to be a gear-head. Then again, I was learning to code on a $200 Chromebook that I had to install Linux onto. Build times took ages. But it got me here. 🤷‍♂️

    23. There's so much time. Back to no. 18. Not so much a lesson as much as an observation. The 20s to me felt like a race to Arrive and find stable ground. Once you have it, somewhere between 28 and 36 for most folks I talk to, the world opens up. So savor whichever stage you're in, the striving or the sustaining. Both have their own beauty on the journey.

    24. It's ok to give up on something partway through! Thanks for reading! 👋


    Adding RSS Feed to Next.js with SSR

    I'm a big blog nerd. Growing up, I subscribed to my favorite webcomics. I mourned the death of Google Reader. I love the spirit of blogging today as an alternative, slow paced social media.

    Naturally, I HAD to get one going on this site!

    There are several great resources for getting a feed going with SSG and Next.js. This one was a favorite. Here, I'm going to add my experience setting it up with an SSR Next site.

    The Sitch

    Here's what static site solutions suggested:

    • Write your rssFeedGenerator function
    • Add the function to a static page's getStaticProps method
    • On build, the site will generate the feed and save it in a static XML file.

    The issue for my use case is that my site is leveraging Server Side Rendering. I'm doing this so I can upload a post that is scheduled to release at a later date. With a static site, I would be stuck with old data and the post wouldn't release. With SSR, there is a simple date comparison that filters published and scheduled posts.
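
    For a sense of it, the filter boils down to a date check along these lines (a simplified sketch of the filterBlogPosts helper that shows up in the code below):

    // Sketch: only keep posts whose publish date has already passed
    const filterBlogPosts = (post) => new Date(post.date) <= new Date();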

    So, since we have a Server Side Rendering solution for pages, we need an SSR solution for the RSS feed.

    Rendering RSS Feed from the Server

    I'll briefly start with the code to generate the XML file for the feed. I'm creating a generateRSSFeed method that largely looks similar to the one described in this guide.
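
    In broad strokes, it maps over the posts and builds up the XML by hand. A simplified sketch of the shape, rather than my exact implementation, with an illustrative URL structure:

    function generateRSSFeed(posts) {
      const items = posts
        .map(
          (post) => `
        <item>
          <title>${post.title}</title>
          <link>https://www.chrisdpadilla.com/${post.slug}</link>
          <pubDate>${new Date(post.date).toUTCString()}</pubDate>
          <description><![CDATA[${post.excerpt}]]></description>
        </item>`
        )
        .join('');

      return `<?xml version="1.0" encoding="UTF-8"?>
      <rss version="2.0">
        <channel>
          <title>Chris Padilla</title>
          <link>https://www.chrisdpadilla.com</link>
          <description>Music, art, software, and more</description>${items}
        </channel>
      </rss>`;
    }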

    That gets passed to my handler getRSSFeed.

    export async function getRSSFeed() {
      const posts = await getAllPostsWithConvertedContent(
        [
          'title',
          'date',
          'slug',
          'author',
          'coverImage',
          'excerpt',
          'hidden',
          'content',
        ],
        {
          filter: filterBlogPosts,
          limit: 10,
        }
      );
    
      const feed = generateRSSFeed(posts);
      return feed;
    }

    lib/api.js

    And here's the tweak: I'm using the method in the api routes folder instead of getStaticProps.

    import { getRSSFeed } from '../../lib/api';
    
    export default async function handler(req, res) {
      const xml = await getRSSFeed();
      res.setHeader('Content-Type', 'application/rss+xml');
      res.send(xml);
    }

    pages/api/feed.js

    Instead of generating a static file and saving it to our assets folder, here we're serving it up from the API directly.

    And that's it! Once the time passes on a scheduled post, the next request to the feed will include that latest post!


    Balancing New and Familiar Tech

    After developing this site, I realized that getting started was the hardest part.

    When I set out to build it, I had a clear vision for what I wanted to accomplish. I also had a very ambitious set of tech I wanted to learn along the way.

    Learning It All

    I was inspired with this project to roll my sleeves up and get close to the metal. At work, I design web apps with React, Meteor, Mongo, and several other tools that make life easy. I was hungry to balance it with a real challenge.

    To me, that meant:

    • Writing blog posts in markdown
    • Converting markdown to HTML
    • Handling my own routing by picking Express back up
    • Learning a new templating language
    • Deploying to a higher "professional standard"
    • Handling image hosting
    • Optimizing images
    • AND MORE

    Basically, I wanted to hand code as much as I could without any help!

    Getting Stuck

    This went nowhere fast.

    After getting an Express server up, I was deep in decision fatigue. I was having to make unique choices about so many details. I had to learn as I went with a greater number of libraries and tech. I had very little that felt familiar in front of me.

    And so I was stuck motivationally.

    Pareto's Principle

    If you're unfamiliar, Pareto's Principle is the idea that roughly 80% of consequences come from 20% of causes, and vice versa.

    The principle is popular in business. 80% of revenue comes from 20% of clients.

    I realized while in the weeds that it's a fair ratio for development and learning new tech, too.

    80/20 Rule in Tech

    So, ego got checked at the door. I scaled back the "newness" of what I was doing by picking familiar tech - React, Next.js, hosting on Vercel, AWS.

    I then experienced the sweet spot of balancing new technologies and features while building the site.

    My final balance with this tech looked like this:

    • I was familiar with 80% of what I was working with. (React, Next, Vercel, AWS)
    • I was unfamiliar with 20% of the tech I was working with (Hosting Markdown and Image Optimization)

    I found flow with that ratio. It's when I was trying to work with more of a 50/50 balance that I lost momentum. When I was trying to get back into Express with a new templating language, serving static files, AND all of the above new tech, I stalled.

    Finding the right balance kept me productive, happy, and still learning a great deal along the way.


    SSG vs SSR vs CSR

    While building my site, I did a deep dive into rendering. Next.js can serve up static files, client side rendering, and server side rendering. All on a page-by-page level, even! I wanted to define the pros and cons of each. Here's what I found:

    Static Files

    These are the fastest to serve up! Think a simple HTML file or image. There's nothing to process, the server just loads the file and sends it off. These are easily distributed to CDNs as well, so that speed translates all the way from California to Australia.

    Static Site Generation

    The benefits of static files, with the flexibility of templating. If data is stored in a DB or CMS and needs to be piped into your site, this is a great solution. On site build (say, when you push new code), static HTML files are generated from templates and pulled data. The data needs to be something relatively unchanging, as the site typically only builds once and then caches the statically generated HTML files.

    Next has some neat enhancements to this. Incremental Static Regeneration can regenerate your pages after build as your data is updated. You can either set this to a time interval or you can even connect your CMS to your app with a webhook. This way the site only regenerates when data is updated.
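
    To give a feel for the time-based flavor, it's only a small addition to getStaticProps (a minimal sketch; getPosts is a stand-in for whatever data fetch you're using):

    export async function getStaticProps() {
      const posts = await getPosts(); // stand-in for your CMS or DB call

      return {
        props: { posts },
        // Regenerate this page in the background at most once an hour
        revalidate: 60 * 60,
      };
    }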

    Server Side Rendering

    As the name suggests, rendering happens on the server, the same as with SSG. The difference is that it happens on request as opposed to at build time. Builds happen infrequently and are triggered by a specific event. Requests, however, happen whenever a user asks for your webpage. This has been the way of the web for decades, with PHP, Ruby on Rails, and even Node.js and Express.

    With several different rendering options now available, this method shines with data that changes fairly frequently. This is also a great solution for sites requiring user authentication, such as logging in to a portal to view your utilities bill.

    Client Side Rendering

    The new hotness, relatively speaking. A JS framework such as React or Vue is used to send a root div and a whole lot of JavaScript to dynamically render the page in the browser. Data is often pulled from several sources that are frequently updated. This is the solution of choice for building apps, dashboards, and anything requiring real time data.

    The Weird Middle-ground

    So, the above is a spectrum, typically trading site performance for the freshness of data served (to vastly oversimplify it.)

    What if you fit somewhere in the middle? Incremental Static Regeneration is really close to Server Side Rendering on the spectrum. Which do you go with then?

    My situation is that my data is not changing, but the conditions for rendering it do change. I finish writing my blog posts on one day, push them to the site, but then only want them to publish a week later.

    OK, so ISR would be great. You can time the interval of when to regenerate the page. What's the big deal?

    The Decision Depends on Volume

    Traffic. I'm just starting the site, so volume is pretty low. With ISR, the first visitor after the regeneration event gets a cached version of the site. Then, later visitors get the fresh one. But that first person is a big deal to me! If I were a national e-commerce site, no sweat. But I'm a local mom and pop shop on the internet.

    Not to mention, ISR adds a layer of complexity and maintenance of its own.

    So! My choice for the site is to go with SSR. I trade off the wicked fast benefits of SSG and ISR. In its stead, I have greater simplicity and the assurance that the few folks visiting my site at the start are getting the freshest content.

    As traffic increases, switching over to ISR is still an option thanks to Next's flexibility.


    My New Website! Details and Tech

    I'm very excited to have plowed some land and planted the seeds for my own garden on the web!

    The sites I've developed have represented big phases in my life. Moomoofilms.com was my portfolio for YouTube sketches when I was a kid. After grad school, I put up a music teaching portfolio site for students. Starting in tech, I put together a landing page for all my projects.

    With chrisdpadilla.com, it feels like another step. A unified home for all the different wanderings I do in tech, music, and writing.

    So yes, websites are great! I would definitely recommend getting one!

    With the sentimental side of the site laid out, let's talk tech!

    Considerations

    Features

    Blogging is the main feature of the site. Aside from that, there's a little bit of static file hosting. I do love the idea of playing with full stack features in the future, though, so having access to server side code is also a necessity.

    Longevity

    At the same time, I want something that will last. This site will be my playground for experimenting with new code, but I don't plan on doing a Scott Tolinski level of regular refactoring.

    I started hacking sites in the 2000s. A lot has changed and improved since then, and I want to take advantage of where development is made easier! And I do want to balance that with also making the site portable.

    Performance

    On a structural level, I wanted the site to be performant and accessible. I grew up on view source, and I love when I stumble on a site where I can still find beautiful html in the developer tools. I'm a little old fashioned - even if I'm using modern tooling, I love the feeling of making a site similar to how I would have back when I was growing up. Simplicity just feels good!

    Tech

    Content in Markdown

    My first decision was to build a system where I owned my content and could easily move it as frameworks and CMSs come and go. I do a lot of my personal writing, reflecting, and note taking in markdown already, so writing the blog in markdown files was an easy choice.

    Since they are so lightweight, the posts are stored in the same repository as the code. Down the line, this also makes updating the site really simple: whenever I push a commit with a new post, the site rebuilds and redeploys.

    An Initial Detour

    My first iteration of the site was an Express server. This hit all the boxes at first:

    • It can easily handle blogging features, while being hugely extensible.
    • Express has been tried and tested. The MVC approach to building websites is also a classic method.
    • It would be performant, rendering static files.

    I was determined - I was going to hand code as much as I could and learn a great deal along the way!

    And then it got tedious. I'm up for a good challenge, but I found myself hitting decision fatigue very quickly.

    I needed a bit more help. Momentum is a key ingredient in my projects. If I kept at it with Express, I felt I would lose that momentum.

    Next.js

    I scrapped what I had and switched over to Next.js. Put simply, Next handles everything I was looking to do myself, but makes it effortless.

    Feature-wise, the framework is flexible enough to switch between Server Side Rendering and Static Site Generation on a page by page basis. On top of that, api routes are available to deploy serverless functions for any future features and integrations.

    That lends itself to great performance. There's potential to ramp up the performance through caching, edge function support, and built in image optimization.

    Next has been tried and tested. There's excellent community support. They're also up to version 12, after many successful iterations. I'm not worried about the technology disappearing.

    Asset Hosting on Amazon S3

    An incredibly cheap and easy solution! Next pairs really well here. next/image handles many key performance optimizations, so all I have to worry about is uploading assets to S3.
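
    The wiring is mostly a matter of telling Next which remote host it's allowed to optimize images from. A rough sketch, with a placeholder bucket hostname:

    // next.config.js
    module.exports = {
      images: {
        // Placeholder hostname; swap in your own bucket
        domains: ['my-assets-bucket.s3.amazonaws.com'],
      },
    };

    From there, next/image takes care of resizing and lazy loading wherever the component is used.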

    Style and Design

    My design is intentionally simple. I'm not a full blown minimalist (my desk is always a mess!), but I appreciate a design that doesn't detract from the content.

    Structurally, everything is in one plain old CSS file. I'm using Custom Properties (variables) for some repeating values. But that's as fancy as it gets!

    Hosting on Vercel

    Again, a natural pair with Next. It comes with integrated deployment through Github, easy to access logs, and simple set up for SSR and SSG. Also pretty cheap!

    Challenge

    A future concern I have with the site is vendor lock-in. Next works like a charm on Vercel. There's support for hosting on other platforms such as Digital Ocean and Netlify. It's hard to say at this point if staying on Next and hosting with Vercel will be the best choice in the future.

    For the time being, I definitely needed to get up and running quickly! So I'm happy with my choice today. As I continue developing, I'm planning to decouple my personal server logic from Next's API.

    What I Learned During Development

    Working on this site has been filled with learning opportunities, both in hard and soft skills. For more on particular areas of learning, you can read my articles on the subjects:

    Launch and Beyond

    Now that the code is in a presentable spot, I'm ready to fill the pages up! I already have a few more blog posts and albums in the works. It feels great to have a central home for all of them.

    Here are a few selected inspirations for the site's design:


    Symbolic Links

    The Sitch

    I'm in a spring cleaning state of mind with my data!

    I keep a bunch of text files in a Journal/ folder on my computer. It's pretty similar to the hierarchy Derek Sivers lists in this post on writing in plain text.

    And then separately I have this blog where I store articles as markdown files in the codebase itself.

    personalBlog/
    |- _posts/
    |- components/
    |- util/
    etc...

    It's all on my computer, of course — but it feels strange to write in prose in a text editor like VS Code. It even feels strange to store article drafts in the same place as the code for the site.

    They are only a few clicks away, but they feel like far-flung and very different spaces. So how do I keep published articles and drafts near each other and organized, while still sourcing them somewhere my blog has access to?

    Alex Payne mentions using symlinks in this article.

    Creating one looks like this:

    $  ln -s source symlinkDestination

    A very elegant and easy solution! The idea is that you can have a link to another file or directory within a completely different place. Like a regular URL link, but for local files.

    My original idea was to have a symlink in the repo for this codebase and have all my writing stored in the Journal folder, including published pieces. The issue is that symlinks are just that - links. When storing them in git and publishing on GitHub, the files themselves are not pulled in.

    So I swapped the direction. The symlink lives in my Journal directory, and the actual files are in the codebase. When I publish this article, I'm moving the file from Journal/blog/drafts into the symlink Journal/blog/_posts, which then moves it over to the appropriate folder in the code repo.
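
    Concretely, the command ends up looking something like this (the paths here are just for illustration):

    $  ln -s ~/code/personalBlog/_posts ~/Journal/blog/_posts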

    It works beautifully on the command line.

    A nitty, gritty, small tool - but one that makes me unreasonably happy to use!

    A Side Note on Aliases

    Macs have Aliases, which work in a very similar way. They are restricted to Finder, though. I'm working mostly on the command line when I'm writing and working on code, and symbolic links are recognized both by Finder and by Unix-based systems on the command line.


    How to Learn Web Development

    My Path

    From 2019 to 2021, I taught myself modern web development. What began as a fun hobby eventually turned into a completely new and exciting career trajectory for me. I taught music at the time, and would take my spare moments between lessons and at nights hacking away at projects and learning from online resources.

    It was a blast and has changed my life. Well, obviously in the career department! But teaching myself to properly program was a surprising discovery in itself. I really impressed myself with how much I could figure out on my own with enough resourcefulness. Plenty of folks I know benefitted from school, bootcamps, or tutors. But I would strongly encourage trying to self teach, if you're curious!

    Who This is For

    Absolute beginners! Intermediate learners. Anyone in between! If you are doing this as a hobby or are interested in working full time, then these will help along your journey.

    The most challenging part about self teaching is the lack of structure in the curriculum. This article will be a mix of what worked for me as well as what I would do differently if I could. The best of both worlds!

    It's also worth noting this is for full stack web development specifically. The gist is that web developers can be put in two camps: those who focus solely on UI (a more designer approach) and those who take a more balanced, logic-handling approach (more engineering and data oriented). Both are highly valuable, and the materials below guide you through both, but I lean more in the direction of full stack vs pure front end in my experience.

    That said, let's start with how to approach these resources.

    The Mindset for Learning

    These are topics all unto themselves. Almost cheesy, but important enough to share:

    Focus on Foundations First

    Frameworks come and go. They also hide the tricky parts of using certain tech. Writing React is meant to be easier than vanilla JS once you get the hang of it. Get really good at vanilla first. Same for CSS and HTML — it's invaluable to be really comfortable with the basics first so that you can easily transition between tools.

    Be Consistent

    Musicians will be familiar with this. Practicing saxophone 30 minutes every day is better than 4 hours once a week. The same is true of programming. You often learn between sessions, while washing the dishes. The routine facilitates that.

    50% Rule

    Practice learning the way that artists do with the 50% rule. The gist is you should spend some time learning from materials - books, blogs, tutorials - and an equal amount of time creating (just for the sake of creating).

    It's both a sanity check and a way to actually deepen your learning. Doing this will instill the confidence that you can write a few lines of working JavaScript, or style a site. And it's more fulfilling to step away from the tutorials and genuinely create.

    Some resources are good about encouraging this. Books or videos sometimes come with practice problems. If not, make up your own!

    This is the hardest part. That's ok! It gets easier the more you do it. Start small - 5 minutes of study, 5 of practice. Then 15. 30. An hour, two — a day, a week. You'll be amazed at how quickly you grow this way.

    Resources For Learning Web Development

    JavaScript

    Start with JavaScript! HTML and CSS are easier, but refining JavaScript will take the most time. Eat the frog first and start here. For any HTML and CSS you need to know while learning, get the bare-bones basics, just enough to start scripting. I used a bootcamp's prep course, though I'd recommend a book or alternative course.

    Head First JavaScript is a fun and readable guide. Free Code Camp also has a great course on JavaScript that's interactive, perfect for the 50% rule.

    HTML, CSS, and Refining JS: The Odin Project

    The real bulk of my learning was done through the open source site The Odin Project. Wildly thorough, the project guides you from just getting started to building full stack apps. All along the way, you are challenged with suggested projects to try out what you learned.

    The site itself primarily links to other resources, all of great quality! It's a great demonstration of what your continued learning will look like on the job, but with the structure of a curriculum. Volunteers maintain it, so the content has stayed relevant through the years.

    The course is an overview, and there are a few holes here and there. But the focus on what and how to learn each piece of the web development puzzle is invaluable.

    One note — It's ok to completely crash and burn on a project. I got stuck at one point on an object oriented JS project and simply needed to move on. That is ok! There are other sources that can help fill gaps below.

    Frameworks and Refinement

    To fill in those gaps, getting even more experience building apps and learning from others is crucial. The curriculum above is a great starting place, but it takes more portfolio development to feel really confident in development.

    Most of my favorite video courses are by Wes Bos. React, Next.js, and even his beginner JS course were a great, hands on way to see how individual pieces played together. They're fun, practical, and no-fluff guides.

    Level Up Tutorials are quick guides to getting up and running with a certain piece of tech. It's staggering how much Scott has put out on this site. Find whatever interests you here. Balancing the fundamentals with a piece of new and interesting tech like GraphQL or Gatsby at this point is fun! Not to mention, an indicator of passion and curiosity in development.

    These two also host a podcast called Syntax. I listened to their show a lot to get up to speed on what's modern, and also to plain old learn how developers speak. It's a bit easier to pick up context from conversation than just from text, in my opinion. And it's another way to learn while washing the dishes!

    Going Pro

    More on mindset: It will be a slog. Hundreds of applications may yield only a handful of interviews.

    The Odin Project does a great job of offering resources for this phase. I would just add this much:

    Algorithms

    They are a big chunk of it.

    Computer Science style algorithms weren't a big part of interviewing for me. But learning them still made me a much better JS dev. Colt Steele's course is a great way to get familiar with them.

    Cracking the Coding Interview is essential reading. Just reading the first half of the book and trying out the first few problems will be good practice. You don't need mastery here, just familiarity.

    The type of algorithms I did do were more toy problems, à la what's on CodeWars. A problem or two a day will keep your problem solving skills fresh.

    General Technical Knowledge

    One technical interview I took was much more trivia-style. Google "React Interview Questions" and you'll find what I'm talking about.

    Spending a bit of time learning these (and the principles behind them!) and then adding them to a spaced repetition system helped me integrate them. Podcasts also helped here — this type of interview is mostly to gauge if you can speak like a developer.

    Portfolio

    For the big pieces on your portfolio, write out a short readme on what the app does, the stack, challenges, and major features. This is both for anyone looking at your work and for yourself when you interview.

    It's one thing to be able to talk abstractly about third party integrations. And another to say "Yeah, when I was building my Next app, I used the Stripe API to handle payments on the server. First I..."

    Open Source

    The classic paradox - entry level jobs ask for 2 years of experience. I've heard that client work or tutoring has helped people here. For me, I really enjoyed getting experience through Open Source.

    Look on Code for America's site to see if there is a chapter in your area. If not, some groups may still be working virtually. I'd recommend this over searching GitHub for projects for a few reasons:

    1. You'll likely get experience working with volunteers with different skills. Designers, researchers, and data analysts may work on the same project. The same is true with programming - you may be the front end expert in a team of backend devs.
    2. You'll learn fast the hard skills of programming in a group. Setting up a new dev environment, using git as a collaboration tool, and participating in code reviews are all part of the experience in open source and on the job.
    3. It's more fun!! The people here want to help you contribute and grow. The projects are for great causes. And writing code that genuinely serves other people in tandem with other volunteers is wildly fulfilling.

    Contributing to open source can be a commitment. It's well worth the effort, though. The confidence and support system you have through it can counterbalance the challenge of applying for a job.

    Good luck!

    Take what works for you and leave the rest. Let me know if this helped! Self teaching can be an isolating path, so feel free to reach out and share where you are.


    Parsing Markdown in Node

    I'm writing my own markdown-based blog, and had a lot of fun getting into the nitty-gritty of file reading and text manipulation. It takes a little more writing than off-the-shelf solutions, but I wanted to have more control and ownership over the process.

    Sample File

    I have a file written like so:

    ---
    title: Parsing Markdown in Node
    tags:
      - node
      - blog
      - tech
    date: 2022-05-19
    
    ---
    
    ## Sample File
    
    I have a file written like so:

    The main body of the post is prefaced by metadata. I want to grab the metadata so it can be used in the formatting engine, and extract the post body separately.

    A few tools will help along the way:

    Libraries

    Node File System

    Built into Node, I'll use fs to open the data and extract it as a string that can be parsed. readdirSync scans the given directory for the file I'm looking for. readFileSync will then read the file and return its contents as a string I can later manipulate.

    Both these methods actually have asynchronous counterparts! My files are not resource heavy at all, so there's no need to run concurrent asynchronous calls. It could be handy for larger amounts of data, though.

    The path below is constructed with a variable passed in by the user, so I'll handle the case where they've requested a file that doesn't exist.

    const files = fs.readdirSync(path);
    const fileName = files.find((file) => {
        return file.includes('.md');
    });
    
    if (!fileName) {
        console.error('No file found');
        return res.sendStatus(404);
    }
    
    
    const markdown = fs.readFileSync(`_posts/${postName}/${fileName}`, 'utf8');

    Showdown

    With the string version of the file in hand, I'll use Showdown to convert the body to HTML. It's simple and flexible, and even bidirectional if you need it to be. Conversion takes just a few simple lines of code:

    const showdown = require('showdown');
    
    const converter = new showdown.Converter();
    const postHtml = converter.makeHtml(postBody);

    Regex

    From here, it's all string manipulation to get the data I need.

    Potentially, splitting the string by the bars ('---') would be enough. Using regex, though, will keep the process more flexible, in case I use the same bars to break up a section within the article.

    This regex will do the trick:

    '---(.*?)---\n(.*)'

    The match method in JavaScript returns separate capture groups as part of the returned array. The return value contains:

    • index 0: full match (the entire document in our case)
    • index 1: the first capture group, our tags
    • index 2: the second capture group, the post body

    From here, I just need the built-in split, map, and trim methods to grab the data.

    Voilà!

    HTML post body and metaData received!

    Here's the full code:

    const fs = require('fs');
    const parseMarkdownPost = require('../utils/parseMarkdownPost');
    
    module.exports = (req, res) => {
      // The post name comes in from the user's request
      // (the route parameter naming here is illustrative)
      const postName = req.params.postName;
      const path = `_posts/${postName}`;
    
      // Find the markdown file in the post's folder
      const files = fs.readdirSync(path);
      const fileName = files.find((file) => {
        return file.includes('.md');
      });
    
      if (!fileName) {
        console.error('No file found');
        return res.sendStatus(404);
      }
    
      const markdown = fs.readFileSync(`_posts/${postName}/${fileName}`, 'utf8');
    
      // Hand the raw markdown off to the parsing util
      const parsed = parseMarkdownPost(markdown);
      if (!parsed) {
        return res.sendStatus(404);
      }
    
      const [metaData, postHtml] = parsed;
    
      // Render with the app's view engine (view name here is illustrative)
      return res.render('post', { metaData, postHtml });
    };

    indexController.js

    const showdown = require('showdown');
    
    module.exports = (markdown) => {
      // Regex matches the bars, captures the meta data, and then goes on to capture the article.
      // The lazy quantifier stops the metadata capture at the first closing bars,
      // and the s (single line) option allows the dot to also capture new lines.
      const fileRegex = new RegExp('---(.*?)---\n(.*)', 's');
      const splitMarkdown = markdown.match(fileRegex);
      if (!splitMarkdown || splitMarkdown.length < 3) {
        console.error('Misformatted document.');
        // res isn't available in this util; signal the error to the caller instead
        return null;
      }
      const [match, metaData, postBody] = splitMarkdown;
      const metaDataObj = {};
    
      // Parse metaData
      metaData.split('\n').forEach((line) => {
        // Store into data object
        const [key, value] = line.split(':').map((item) => item.trim());
        // if tags, split into an array
        if (key === 'tags') {
          // Let's actually delineate tags by commas instead of -'s.
          const tags = value.split(',').map((item) => item.trim());
          metaDataObj[key] = tags;
        } else {
          metaDataObj[key] = value;
        }
      });
    
      // Convert to html
      const converter = new showdown.Converter();
      const postHtml = converter.makeHtml(postBody);
    
      return [metaDataObj, postHtml];
    };

    parseMarkdownPost.js


    Compute! Gazette

    Cover of instruments floating (or sinking into) a Commodore 64

    Marveling at Compute! Gazette, a magazine for Commodore's 8-bit home computers.

    Advertisement from the magazine

    It's very 80s. It's also really sincere and amazingly technical.

    Programming Music and Sound article

    You could use the above music program if you special-ordered a disk from the magazine. Or you could COPY THE ENTIRE PROGRAM BY HAND into your computer!!

    Spread of code

    Absolutely amazing. Also loving the directory of user groups. It's continually humbling to see coding as a historied practice, even if it's only from a few decades back.

    User Group directories. From Pennsylvania to Mexico!

    All issues are on the Internet Archive. Cover from issue 73. Pages from issue 31.


    Building a Full Stack MERN Events App

    Mystery Cowboy Theater Preview

    Mystery Cowboy Theater

    Github | View app in browser

    Mystery Cowboy Theater is a fictional single screen theater that loves showing exclusively Mystery Science Theater 3000 films and episodes! This application displays a movie selector, current ticket order, and a movie editor.

    This project fulfills all CRUD operations in two areas: ticket orders and films being screened. After being populated by the database, the application state is contained in the app component and data is shared down to the other supporting components to allow instant updating app-wide. State is maintained through two means: local storage for ticket orders and MongoDB for movie showings.

    Tech used:

    • React
    • MongoDB
    • Mongoose ORM
    • NodeJS
    • Express
    • Custom SCSS
    • Passport
    • BcryptJS
    • Deployed to AWS Elastic Beanstalk

    Back-End and Front-End Communication

    My aim with this project was to bring together React and the Express backend into one application. Since my last major React project, I planned to add complexity with multiple components needing the same state data, making the project more modular, and hooking the app up with a database.

    Structurally, the app is contained within one Node project, with the front end’s subdirectory being run through a specific script in the package.json file. The deployment platform runs this particular script so that it can build and serve up the React application for the client as well as run the Express application. The two points communicate with each other through a REST API served up by the Express server and queried by the front end with axios.

    A brief note about CORS

    A challenge in the project was discovering that, even with the client and server wrapped in the same project, a way to handle Cross-Origin Resource Sharing was required to allow data to be passed from the client to the server. Thankfully, the solution simply required bringing in a middleware package. Currently, the application does not use custom options; however, a potential next step for a more robust application with sensitive data would be to specify approved origins with the CORS package configuration.
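
    In practice, that boils down to registering the middleware on the Express app. A sketch with the cors package's default, permissive options:

    const express = require('express');
    const cors = require('cors');

    const app = express();

    // Allow cross-origin requests between the React client and the API
    app.use(cors());

    // A more locked-down setup would pass approved origins:
    // app.use(cors({ origin: 'https://my-production-domain.com' }));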

    Secure Authentication

    Editing movies requires being logged in as the theater owner. There are both client and server side security measures to ensure that the appropriate user is authenticated before editing. While contained within the same project, the authentication process treats the front-end and back-end as separate entities needing to communicate with one another. The best solution for this was to use JSON Web Tokens. I’ve used Passport to implement the creation and authentication of the token that users send in request headers through axios. To keep passwords secure, bcrypt was brought into the project to hash newly created passwords and verify login attempts.
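
    The token flow, roughly sketched (the route and field names here are illustrative, not the app's exact code):

    const jwt = require('jsonwebtoken');

    // Server: once Passport verifies the login, issue a signed token
    const token = jwt.sign({ id: user._id }, process.env.JWT_SECRET, {
      expiresIn: '1d',
    });

    // Client: axios attaches the token to the headers of protected requests
    axios.put(`/api/movies/${movie._id}`, updatedMovie, {
      headers: { Authorization: `Bearer ${token}` },
    });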

    Single App Hosting on AWS

    This project is hosted as a single application on AWS through Elastic Beanstalk and updated with the Code Pipeline CI tool. You can read more about how I configured the project for AWS in my article Hosting a Node Express App on AWS Elastic Beanstalk. You can also read on the why and how of Rendering a React App from an Express Server.


    Building a Message Board with User Authentication in Express

    Secret Poets Society Preview

    Secret Poets Society

    Github | View app in browser

    The Secret Poets Society is a club of artists sharing work purely for art's sake. Not for fame or glory! As part of the practice, the authors' names are unlisted to visitors.

    The site has three tiers of membership - guest, member, and admin.

    • GUESTS can sign up and login to post their own poems, joining in the art for art's sake movement.
    • MEMBERS have the added privilege of seeing which of their colleagues posted which poem and on what date.
    • ADMINS can see what members see and also have the ability to delete any poem they see fit.

    Tech used:

    • NodeJS
    • Express
    • MongoDB & Mongoose
    • Passport
    • Handlebars.js
    • Custom CSS

    MVC Design Pattern

    This Node application follows the Model-View-Controller approach to design. Files are organized into their roles within the system. The Mongoose ORM serves as the data model, Handlebars.js is used for templating views, and Express routes and middleware serve as the controllers. Controllers are the bulk of this application, handling data validation, sanitization, and user authentication.

    To handle authorization, the Passport package is used to coordinate login. Express Sessions are then used to maintain the user’s account session with the server. User data is stored in MongoDB, including encrypted login information and their membership tier. If a user is accessing a protected route, a specific controller is accessed. With the controller functions, the logged in user’s membership tier is checked against the permissions required for a given request. If they do not have the correct authorization, the user receives an error message rendered by the application. Otherwise, they are granted access.
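
    The permission check itself can be pictured as a small piece of middleware (a sketch with made-up names, not the app's exact controllers):

    // Passport populates req.user once the session is established
    const requireTier = (allowedTiers) => (req, res, next) => {
      if (req.isAuthenticated() && allowedTiers.includes(req.user.membershipTier)) {
        return next();
      }

      // Otherwise, render an error message instead of the protected view
      return res.status(403).render('error', {
        message: 'You need a higher membership tier to do that.',
      });
    };

    // Usage (illustrative): only admins may delete poems
    // router.post('/poems/:id/delete', requireTier(['admin']), deletePoem);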

    Data Security & Sanitization

    My greatest area of learning through this application was security and sanitization. Incoming data has the potential to be incorrect or, worse, malicious. The Express Validator package is used to clean the data of any malicious code. Other methods from the package ensure the inputs match the expected types and that required fields are provided.
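
    As an illustration, a validation chain for a new poem might look something like this (the field and view names are made up):

    const { body, validationResult } = require('express-validator');

    const validatePoem = [
      // Validate and sanitize the incoming fields
      body('title').trim().isLength({ min: 1 }).escape(),
      body('text').trim().isLength({ min: 1 }).escape(),

      (req, res, next) => {
        const errors = validationResult(req);
        if (!errors.isEmpty()) {
          // Re-render the form with the validation messages
          return res.render('poemForm', { errors: errors.array() });
        }
        next();
      },
    ];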

    A special case is user login data, where the password needs an extra layer of security. A custom implementation of Passport brings in BCrypt, a library for password hashing, to securely save a hashed password and verify it against login attempts. This ensures that if the network request is intercepted or if the database is accessed outside the application, critical data is not exposed.
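
    The bcrypt piece boils down to two calls, hashing on signup and comparing on login (a minimal sketch, not the exact Passport setup):

    const bcrypt = require('bcryptjs');

    // On signup: store only the salted hash, never the plain text password
    const hashedPassword = await bcrypt.hash(req.body.password, 10);

    // On login: compare the attempted password against the stored hash
    const passwordsMatch = await bcrypt.compare(req.body.password, user.hashedPassword);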

    Challenges

    While it was a pleasure using a new UI library, I quickly discovered that Handlebars.js has its limitations. The library can handle a fair amount of conditional rendering. But, as the application grew more complex, I found myself trying to make views reusable through logic that Handlebars didn’t support. For example, sharing form elements between the login page and the account creation page. This comes from my experience of working heavily in React, a library that makes breaking a page into components accessible and flexible.

    The solution was to approach views from a page level, allowing separate pages to share a fair amount of layout formatting while making the small tweaks necessary to better suit each page’s needs. Working with Handlebars, it was still possible to use conditional rendering for items such as the authors of the poems, while still allowing the server to perform the heavy lifting of logic.

    Hosting on AWS

    This project is hosted on AWS through Elastic Beanstalk and updated with the Code Pipeline CI tool. You can read more about how I configured the project for AWS in my article Hosting a Node Express App on AWS Elastic Beanstalk.

    Next Steps

    Much of what I learned in this project carried over into a later application, Mystery Cowboy Theater. In that Node application, I took the pains I felt with Handlebars’ page approach and switched over to React to break the UI down into components. Authorization carried over as well, this time using JWTs to pass user data between the client and the API.

    Poem Credits

    Authors are viewable by members. Major poets include Shel Silverstein, Walt Whitman, and Robert Frost.


    Building a Full Stack Next.js E-Commerce App

    Taco Time Preview

    Taco Time

    Github | View app in browser

    Taco Time is a fictional restaurant taking inspiration from Taco Deli here in Austin, TX. With the need to showcase and allow customers to order an array of breakfast and lunch tacos, this application dynamically renders item pages and maintains a detailed cart that stores their orders and customizations. This intricate project employs multiple modern web development tools and techniques, including Server Side Rendering, interacting with a GraphQL API, running serverless functions, and dynamically rendering individual item pages with Next.js's dynamic routes.

    Tech used:

    • Next.js
    • React
    • Redux
    • GraphQL
    • MongoDB
    • Mongoose ORM
    • Apollo Micro Server
    • Apollo Client
    • Styled-Components
    • React-Transition-Group
    • Stripe
    • Deployed to Vercel

    Server Side Rendering with Next.js

    Next.js allows for choosing between Server Side Rendering and Static Page Generation on a page-by-page basis. For this application, assuming owners may need to post notice that an item has sold out, I’ve opted for SSR. On the server, the application grabs the data it needs and renders the html that will be sent to the client. This process alone took a considerable amount of fine tuning as the application needs to interact with Apollo’s cache and await results from MongoDB. I was able to eradicate a bit of a nasty bug in this process by having the getServerSideProps function wait for the DB to establish a connection before rendering the page.
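
    The fix amounts to awaiting the connection before any queries run. A sketch of the shape (dbConnect and Item are stand-ins, not the app's exact names):

    export async function getServerSideProps() {
      // Make sure Mongoose has connected to MongoDB before querying
      await dbConnect();

      const items = await Item.find({}).lean();

      return {
        props: { items: JSON.parse(JSON.stringify(items)) },
      };
    }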

    Crafting API Resolvers in Apollo

    The back end is an Apollo Micro Server. Since this iteration of the app only needed to interact with MongoDB to fetch items in the restaurant inventory, a lighter server seemed a great fit! The server takes in hand-crafted GraphQL schemas and resolvers. The resolvers then fetch the data from MongoDB through interacting with Mongoose schemas. A mutation method was implemented for me to add items to the database as a developer. A possible expansion would be to create a view for restaurant owners to interact with an elegant UI to do this themselves.
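
    The shape of the schema and resolvers is roughly this (a trimmed-down sketch, not the app's exact definitions):

    const { gql } = require('apollo-server-micro');
    const Item = require('./models/Item'); // Mongoose model (illustrative path)

    const typeDefs = gql`
      type Item {
        name: String
        description: String
        category: String
        price: Int
      }

      type Query {
        items: [Item]
      }
    `;

    const resolvers = {
      Query: {
        // The resolver reaches through the Mongoose model to MongoDB
        items: async () => Item.find({}),
      },
    };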

    Integration with Stripe API for Customer Checkout

    The latest feature and my favorite to work through was bringing in Stripe’s checkout component and integrating with their API to charge credit cards for orders. In this implementation, testing mode is enabled so no charges are actually incurred, but test data is sent through the application. (Use card number 4242 4242 4242 4242.) To ensure a secure checkout, the order is handled on the server. There, prices are recalculated with price information on items from the database to ensure correct charging. The order is then converted to a record in the database. Finally, an order is returned to the client with details on their purchase.
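
    The heart of it is recalculating the total on the server before ever talking to Stripe (sketched here; Item and calcOrderTotal are illustrative stand-ins):

    const stripe = require('stripe')(process.env.STRIPE_SECRET_KEY);

    async function processOrder(cartItems, paymentMethodId) {
      // Re-fetch the items and total the order from database prices,
      // never from prices sent up by the client
      const ids = cartItems.map((item) => item.id);
      const dbItems = await Item.find({ _id: { $in: ids } });
      const amount = calcOrderTotal(dbItems, cartItems);

      // Charge the card through Stripe with the recalculated amount
      return stripe.paymentIntents.create({
        amount,
        currency: 'usd',
        payment_method: paymentMethodId,
        confirm: true,
      });
    }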

    Interacting with GraphQL API

    To interact with the API, Apollo Client is used within the SSR functions. To allow for flexibility, the client initialization method checks whether it is being used on the server or the client. The benefit of only grabbing the relevant data is best seen between the menu page and an individual item's page. The menu only needs the name, image, description, and category of an item, so the GraphQL query only requests those fields. The full item display pages, then, request further data, such as customizations, options, and price.
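
    For example, the menu page's query only asks for what the menu needs (a sketch of the idea):

    import { gql } from '@apollo/client';

    export const MENU_ITEMS_QUERY = gql`
      query MenuItems {
        items {
          name
          image
          description
          category
        }
      }
    `;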