• Adventures of a very old Jekyll Setup



    A waterfall of errors

    So I've been writing on an old Jekyll website of mine.
    I promised myself I'd move it to Hugo sooner or later; for now, that's #983 on the TODO list.
    In the meantime, every build bombards me with the following five warnings on repeat:
    
    /var/lib/gems/2.7.0/gems/jekyll-3.1.2/lib/jekyll/convertible.rb:42: warning: Using the last argument as keyword parameters is deprecated
    /var/lib/gems/2.7.0/gems/jekyll-3.1.2/lib/jekyll/document.rb:265: warning: Using the last argument as keyword parameters is deprecated
    /var/lib/gems/2.7.0/gems/jekyll-3.1.2/lib/jekyll/tags/include.rb:169: warning: Using the last argument as keyword parameters is deprecated
    /var/lib/gems/2.7.0/gems/jekyll-3.1.2/lib/jekyll/url.rb:119: warning: URI.escape is obsolete
    /var/lib/gems/2.7.0/gems/rb-inotify-0.9.7/lib/rb-inotify/watcher.rb:66: warning: rb_safe_level will be removed in Ruby 3.0
    Obviously the real answer here would be to upgrade my Jekyll setup to ensure Ruby stops complaining about my ancient stack.
    I have promptly placed upgrading my stack as item #984 on the TODO list (as this is running in isolation, it's not a huge threat).

    For the time being, I am using the following command to silence the noise:
    
    bundle exec jekyll serve 2>&1 | egrep -v 'deprecated|obsolete|rb_safe_level'
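    If you want to sanity-check the filter itself, you can pipe a couple of sample lines through the same `egrep` pattern (the sample lines below are made up for illustration):

```shell
# One warning-style line and one normal log line; only the normal one survives.
printf '%s\n' \
  "url.rb:119: warning: URI.escape is obsolete" \
  "Generating... done in 1.234 seconds." \
  | egrep -v 'deprecated|obsolete|rb_safe_level'
```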
  • AWS S3 Sync in GitHub Actions



    Using GitHub Actions to deploy your static website to S3?

    Great!

    Using AWS S3 Sync command?

    Amazing!

    Did you make sure you have the `--size-only` flag on?

    Without `--size-only`, your entire bucket will be re-uploaded every time you push to your branch. This includes contents that haven't changed at all.

    Why?

    By default, AWS S3 Sync compares the timestamp of your source and destination files. If the timestamp doesn't match - the file will be uploaded. This is a problem, since the files are newly created on each build (they are checked out as part of the workflow). To ensure only size is compared, use the `--size-only` flag like so:
    
    aws s3 sync ./ s3://YOUR_BUCKET_NAME --size-only --exclude '.git/*' --exclude '.github/*'
    p.s. You may have noticed I've also excluded the .git/ and .github/ folders.
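    If you'd like to verify what sync is about to do before trusting it, the CLI's `--dryrun` flag prints the planned operations without transferring anything. A small sketch wrapping the same command in a function (the bucket name stays a placeholder):

```shell
# deploy: sync the current directory to a bucket, comparing sizes only.
# Pass --dryrun as an extra argument to preview instead of upload.
deploy() {
  aws s3 sync ./ "s3://$1" --size-only \
    --exclude '.git/*' --exclude '.github/*' "${@:2}"
}

# deploy YOUR_BUCKET_NAME --dryrun   # preview the planned operations
# deploy YOUR_BUCKET_NAME            # upload for real
```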

    You're welcome ;)
  • The Fulfilled Promise of Serverless



    I respectfully disagree with the arguments made on Last Week in AWS against Serverless.

    Regarding portability issues

    The problem described is down to architecture. Do you need to add logic that handles the API Gateway? Yes. Does it have to be an integral part of your Lambda functions? No. In the majority of Serverless stacks that I've seen, the handling of the API Gateway event (think auth, middleware, different content types, etc.) is done in a separate package and can be relatively easily re-implemented for another Cloud provider. I do accept the premise of AWS Step Functions being the exception to this rule, but I don't believe they are absolutely necessary (nor do they feel productive to work with). Most Step Functions structures can be replaced by either a Queue or a Messaging system, depending on the requirements.

    Regarding the perceived value fallacy

    Essentially, the argument is about the limited impact savings Serverless could bring to the table compared to the overall budget. One critical thing isn't taken into account - the inevitable DevOps department. I have seen plenty of early-stage startups with 5-10 engineers and zero DevOps. In nearly every one of those cases, they managed that by going either fully managed (e.g. Vercel, Auth0) or fully Serverless. Those saved salaries count for much more than the net savings on the infrastructure budget.

    Regarding the difficulty of collaboration compared to WordPress

    It is unclear in your argument whether those collaborators were writers or engineers, but I'm assuming writers. I do agree that it is much easier to write articles using WordPress than your home-made lean Serverless solution. This however completely misses the point. It is not WordPress the writers are missing, it is the WYSIWYG editor (In WordPress's case - Gutenberg). There are plenty of full-featured Markdown editors that can directly save to GitHub. A website generator such as Hugo could easily connect those repositories with your website and presto! Writers are happy again.
  • Dropbox ignore folders like node_modules



    Great Dropbox tip:

    If you have code on Dropbox, and are annoyed by node_modules sync ups (or just want to avoid syncing certain folders), there's a new way to solve your endless sync problems!

    Dropbox has introduced a new attribute you can add to your files/folders to make sure they're ignored!

    On Linux, you could simply create the following alias:
    alias dignore="attr -s com.dropbox.ignored -V 1"
    Then simply use your new dignore command like so:
    dignore path_to_file_or_folder


    Important: make sure no active sync is running for the folder, as it could cause an "Ignored Item Conflict".
    More info on Dropbox Ignore
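    For completeness, here are two more aliases I find handy alongside dignore - one to inspect the attribute and one to clear it. The -g/-r flags come from the same Linux attr tool; the alias names are just suggestions:

```shell
# The same attr tool can read the attribute back, or remove it entirely:
alias dcheck="attr -g com.dropbox.ignored"    # show the current value
alias dunignore="attr -r com.dropbox.ignored" # stop ignoring the path
```

    Use them the same way: dcheck path_to_file_or_folder, dunignore path_to_file_or_folder.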

    Dropbox Ignore VS Exclude

    Dropbox Ignore - when you don't want Dropbox to upload a certain file/folder to the Dropbox Cloud. Dropbox Exclude - when you don't want Dropbox to download a certain file/folder to your local machine.

    That's pretty much it!
  • Packaging NodeJS scripts into a Binary



    Hey Node devs, jealous of Go devs for being able to create single binaries without any dependencies?
    Now you can do it too! With pkg - a binary compiler for NodeJS!

    pkg - from NodeJS to binary

    I recently developed a NodeJS script that automatically consumes SQS messages when an EC2 instance is initiated.
    My biggest issue was the dependencies - the NodeJS installation was often slow, and would sometimes fail or time out.
    On top of that, I had to also install the AWS SDK in order to get the SQS messages.
    The entire thing was too much of a delay for my User Data script to handle.

    Side Note: User Data is a script that executes when an EC2 instance is initialized. Very useful for pulling Docker images or running init scripts.
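    To give an idea of how this plays out, here's roughly what the User Data ends up looking like once the script is a single binary (the release URL and binary name are placeholders I made up). The sketch only writes the script out and syntax-checks it; on a real instance, the body runs at boot:

```shell
# Write the User Data script to a file; the URL is a placeholder.
cat > user-data.sh <<'EOF'
#!/bin/bash
# Fetch the prebuilt binary - no NodeJS install, no npm install, no node_modules/.
curl -fsSL https://example.com/releases/server-linux -o /usr/local/bin/server
chmod +x /usr/local/bin/server
/usr/local/bin/server &
EOF

bash -n user-data.sh   # syntax check only; the instance runs it at boot
```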

    After an evening of debugging, I started thinking - this would be a whole lot easier with Go, as I would just have a single binary file and zero dependencies to install.
    The issue was that converting my entire script to Go would've required much more time, and would likely create new issues (as the AWS SDK varies between the languages, not to mention I'm much more experienced with NodeJS than with Go).
    I then thought that within the infinite universe of NPM there must be a tool that packages binaries out of NodeJS scripts, or at least somehow bundles the NodeJS runtime together with my script.

    pkg to the rescue!

    pkg allows you to bundle your NodeJS scripts into a single binary.
    No more dependencies, no more node_modules/, package.json, npm install...
    Simply download the single executable and... well... Execute!

    I have successfully tested the following with pkg:
    • NodeJS environment (obviously) - includes Node8, Node6, and Node4 (9 is not supported yet)
    • Operating System - includes linux, mac, and windows
    • NPM Libraries (aws-sdk in my case) - works without any configuration
    • Environment Variables - works without any configuration
    • Spawning Child Processes - works without any configuration
    • Async/Await - works without any configuration

    Install pkg

    npm install -g pkg
    Tip: it also works if you install it locally; you'd simply run:
    ./node_modules/.bin/pkg
    instead of just pkg

    Run pkg

    pkg --targets node8-linux server.js -o server-linux
    The command above bundles server.js into a Linux binary named server-linux
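    If you need more than one platform, pkg accepts a comma-separated target list. A sketch wrapped in a helper function (the node8-linux/macos/win names follow pkg's node{version}-{os} target scheme):

```shell
# build_all: compile one NodeJS script into a binary per platform.
build_all() {
  pkg --targets node8-linux,node8-macos,node8-win "$1"
}

# build_all server.js   # one binary per target, next to server.js
```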



    Until next time!