Laravel on an Ubuntu VM

Development on a VM?

This sounds crazy, but I will be developing my Laravel apps on a VM. We don’t all have the luxury of an overpowered VM server running in our basement, but I do. If you also want to deploy on a VPS running Ubuntu, you could follow these steps to get MySQL, NGINX, Redis, and a Laravel web app running. This is just a starting point.

I’m installing Ubuntu 22.04 on my VM. It gets 4 logical cores and 8 GiB of memory. Make sure you have the public SSH key for the machine you’re working on stored in your GitHub account before the installer finishes: the installer can import SSH keys from GitHub automatically while you’re setting up your VPS, and all you need to know is your GitHub username. Once the install is complete, SSH into the VM.

The next hurdle is installing PHP. On Ubuntu 22.04 the default version of PHP is 8.1, but Laravel 11 requires PHP 8.2, so you will have to add the PHP repository to apt using the following command:

sudo add-apt-repository ppa:ondrej/php

With the repository added, you can’t just install PHP 8.2 by itself. If you do, it will install apache2 as a dependency by default. It’s a fine server, but we will be using NGINX. To avoid apache2, you need to install a set of packages that includes php-fpm or php-cli. We will be using PHP-FPM to handle our application, so we’re going to install PHP and our other services (Redis, MySQL, and NGINX) with the following command:

sudo apt install php8.2-fpm php8.2 unzip php8.2-curl php8.2-xml php8.2-mbstring php8.2-redis redis-server php8.2-mysql mysql-server nginx

If you’d like to see which packages will be installed before you commit, add the --dry-run flag to the end of the command. It will show you the packages to be installed and prove that libapache is nowhere to be seen! This command includes a few extra PHP module packages that are required by Laravel 11.

The next step is to install Composer, which can be accomplished in many ways, but the way recommended by the Composer team is to use their install script:

php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php -r "if (hash_file('sha384', 'composer-setup.php') === 'dac665fdc30fdd8ec78b38b9800061b4150413ff2e3b6f88543c636f7cd84f6db9189d43a81e5503cda447da73c7e5b6') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
php composer-setup.php
php -r "unlink('composer-setup.php');"

If you copy and paste that into your terminal, it will download the installer and check its hash. Note that the hash changes with each installer release, so if verification fails, grab the current commands from getcomposer.org. Then use the following command to move the resulting composer.phar into your binaries folder:

sudo mv composer.phar /usr/local/bin/composer

Configuring NodeJS

To install NodeJS and NPM, use the following command:

sudo apt install nodejs npm

Unfortunately, the version of Node that ships with Ubuntu 22.04 is well behind the current stable version. There are many ways to get a stable version, but I find that the easiest is to globally install the n package via npm:

sudo npm install -g n

The n package is able to update Node.js with this comically short command:

sudo n stable

You will probably have to log out of your SSH session and back in, or restart your terminal session, for the changes to Node.js to take effect.

Configuring NGINX

Although NGINX can be configured to serve a single site, I would recommend using its built-in multi-site configuration: add a valid configuration to the /etc/nginx/sites-available directory, then add a symbolic link from that file to the /etc/nginx/sites-enabled directory. The Laravel documentation has an NGINX config, which you should use as a starting point. If you make a configuration file named laravel.conf, the command to symlink it would be:

sudo ln -s /etc/nginx/sites-available/laravel.conf /etc/nginx/sites-enabled
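For reference, here’s a sketch of what sites-available/laravel.conf might contain, adapted from the starting-point config in the Laravel docs. The example.com server name and the /var/www/laravel path are placeholders for your own values:

```nginx
server {
    listen 80;
    server_name example.com;               # placeholder: your domain or IP
    root /var/www/laravel/public;          # placeholder: your app's public directory

    index index.php;
    charset utf-8;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        # Hand PHP requests to the PHP-FPM socket installed earlier
        fastcgi_pass unix:/var/run/php/php8.2-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        include fastcgi_params;
    }

    location ~ /\.(?!well-known).* {
        deny all;
    }
}
```

After symlinking, sudo nginx -t will validate the configuration, and sudo systemctl reload nginx will load it.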

The default user associated with the NGINX service is named www-data; therefore any files that are being served by NGINX need to be owned by the www-data user or managed by the www-data group.

Setting permissions on your application

Whether you’re starting a new Laravel project or cloning a repository, you will have to set permissions on your codebase so that the www-data user is able to access the files. By convention, the www-data user owns the /var/www directory, but if you deploy your application there using your own username, the permissions will result in an error when you try to load any of your application’s routes. After you have cloned or copied your source code into a subdirectory of /var/www, you should install your Composer and NPM packages and build your front end. Afterwards, run the following command from your application’s root to set file ownership:

sudo chown -R $USER:www-data .

This makes you the owner of all the files, with the group set to www-data. Next, you need to make sure that all the files created going forward are owned by the www-data group.

The following commands add your user to the www-data group, ensure that all files created in the folder going forward are owned by the www-data group, and remove read, write, and execute permissions for others.

sudo usermod -a -G www-data $USER
sudo chmod g+s .
sudo chmod o-rwx .
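If you want to see what the setgid bit does before pointing these commands at your codebase, here’s a sketch using a throwaway directory (the /tmp/permdemo path is arbitrary):

```shell
# Create a scratch directory and set the setgid bit on it
mkdir -p /tmp/permdemo
chmod g+s /tmp/permdemo

# The directory's mode now shows an 's' in the group-execute position;
# new files created inside it will inherit the directory's group
stat -c '%A' /tmp/permdemo
```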

Finally, per Laravel’s requirements, set write permissions on the folders that the app uses for cache and storage.

sudo chmod -R ug+rwx storage bootstrap/cache

Configure MySQL Database

When we installed all our packages, we also installed MySQL server and the PDO driver, which allows PHP to communicate with databases. To create a database and a user for our app, we will have to use the MySQL CLI. On Ubuntu, MySQL’s root account authenticates through the system root user by default, and it’s also the only user with privileges to make any changes, so you’ll have to use sudo:

sudo mysql

This will drop you into the MySQL CLI with privileges so you can create a database for your app and a user to access it. In the following commands, replace app_user and db_name with your app user name and database name, and a-secure-password with your DB user’s password. These should be the same values as referenced in your .env file for your Laravel project.

CREATE USER 'app_user'@'%' IDENTIFIED BY 'a-secure-password';
CREATE DATABASE db_name;
GRANT ALL PRIVILEGES ON db_name.* TO 'app_user'@'%';
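For reference, the matching database entries in your Laravel project’s .env would look something like this (the values are the placeholders from above; swap in your own):

```
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=db_name
DB_USERNAME=app_user
DB_PASSWORD=a-secure-password
```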

At this point, you can migrate your database and run your seeders. You’ve done it! If your app continues to give you issues about permissions, you may need to assign ownership of all the files to the www-data user in order to serve them.

sudo chown -R www-data:www-data .

Tell the hardest truth first

The television show Twin Peaks was formative for me. When it was first on TV, I was way too young to be interested in grown-ups talking on TV. Then my aunt, who provided me with my first computer and my first joint, showed me a single wild and twisted scene from the show that she had recorded, wherein a young woman tells the story of a dream-premonition that comes true. Later in life, after watching The Elephant Man and Eraserhead, a friend of mine was like “Have you ever seen the TV show Twin Peaks, directed and written by David Lynch?” As the pilot started I felt strangely connected to the show and its characters. By the time Maddy arrives in Twin Peaks, I’m Rick Dalton:

Long story short, I believe that all knowledge one needs to recover from the trauma of discovering that you are the beneficiary of privilege and not the product of excellence is buried within Twin Peaks and one must simply unpack it. At the beginning of the second season, a freshly hair-plugged Billy Zane playing John Justice Wheeler arrives to offset the maladies of the Horne family. He’s a known good-deed-doer who hasn’t been influenced by the supernatural evil of Twin Peaks. Ben Horne, a wealthy businessman and patriarch, asks him what the secret to being a good person is and his response is the single greatest piece of advice I’ve ever heard:

“Always tell the hardest truth first”.

When this advice found me, I was growing into a person who stands by their morals and isn’t flustered by unqualified influences. In the spirit of telling the hardest truths first, I will admit that I spent all of last week struggling to move my project forward. I am working on a fairly generic social media app that, when finished, should show off that I kind of vaguely know what I’m doing when it comes to being broadly useful at a company that wants to host a web app or website. In order to really send that point home, I think I’m gonna have to host this thing somewhere. When I was in a position that saw me hiring developers, I certainly never would have run code presented as part of an employment application, but I would look at code while playing with a deployed project, so that’s what I intend to offer. There are several options for hosting a Laravel app, but none of them fulfill me, so I’ve decided that I’m going to use Ansible to write a configuration playbook and a deploy playbook so I can launch my app on a VPS…or I may opt to host it locally on my network and use a Cloudflare Tunnel to serve it publicly, which would allow for a quick and clean teardown after I’ve given up caring. Either way, I started the week with a plan, and by day two I already needed to have a hard talk with myself: I don’t know how to set up a system to run Laravel. My SysOps skills are limited to the specific features I had built in the past, and I had never built a Laravel environment from scratch.

So I bailed on Ansible and instead decided that I would just document my experience of getting a Laravel environment up and running on a local VM. This is where I should have been focusing my attention since day one, but now I feel behind and unproductive. With a little knowledge and a lot of documentation I got my Laravel app installed on my local VM, installed a database for persistence, configured a few performance tools, and then set about getting websockets configured. I just wrote an article on the topic, so skipping it would be a glaring omission. That’s when I hit a second major wall. My initial development configuration was fairly easy to set up, but now that I had moved on to a new context, it felt like nothing was working. I had read tons of articles, documents, and tutorials, and it seemed like I had everything configured correctly. For a moment, I considered throwing in the towel and just using my development setup to deploy my app, since it seemed to magically solve my problem. And that was just the sour thought that led me to solving my problem.

My development server used a script to bring up the two services needed to run the websocket feature. On my production server, both services were running, but I couldn’t remember which I had started first. To make sure my development and production environments mirrored each other, I tested the theory by turning both services off, then on, in the order the script booted them. Maddeningly, this fixed the problem. It’s great to see it working. In fact, I love that it was a configuration issue, but I spent all my spare time for three days thinking about it, and the answer was something I had already identified as something that would have to be completed before I could deploy to production. So here’s the hard truth: I got distracted by a shiny object, and then let failure stop progress. I need to be better about compartmentalizing problems and working more fastidiously. I need to work from a “known” towards an “unknown”, and not start my journeys in the dark.

I’ve been working with or on admin software as a part of my job for twenty years. I do not need to be told to reboot it. If I’m asking for help, I’ve either restarted the process, restarted the service, or rebooted the machine. This whole week was a lesson: trust yourself, follow your own rules, don’t be fooled by misdirection. Since I put the stick in my own spokes, I guess the rule that I should consider going forward would be: “If you can’t find the issue in the code, check the mirror.”

Using Laravel Queues, Jobs, and Broadcasting Events

What is a queue and why are they useful

PHP is single threaded, so concurrency is managed in the software layer. Using Laravel’s queue, developers have a first-class abstraction for asynchronous behavior. The queue helps the running application prioritize and execute code asynchronously. One of the nice things about Laravel’s implementation of the queue is the ability to refactor a performance bottleneck into a queued job.

Imagine you’re working on a social media application which has Communities and Users. Users can add Conversations to a Community. Users can then add a Comment to a Conversation or to another Comment by filling out a small form built with Livewire and Volt.

...
new class extends Component {
    #[Validate('required')]
    public string $body;

    public Conversation $conversation;
    public $root;

    public function submit()
    {
        $this->validate();
        Comment::create([
            'conversation_id' => $this->conversation->id,
            'user_id' => Auth::user()->id,
            'commentable_id' => $this->root->id,
            'commentable_type' => $this->root::class,
            'body' => $this->body,
        ]);
        $this->dispatch('comment-created', rootId: $this->root->id);
    }
}; ?>

<form wire:submit="submit" class="space-y-6">
    <div>
        <x-input-label>Add Comment</x-input-label>
        <x-text-input class="block w-full" wire:model="body" />
        <x-input-error :messages="$errors->get('body')"/>
    </div>
    <x-primary-button>Submit</x-primary-button>
</form>

Let’s say this results in a performance bottleneck in our application. We can extract the behavior from the front end into a queued job triggered from the frontend, and allow the application to insert the new comment asynchronously.

Since a comment can be left on a conversation or a comment, I’m using Laravel’s polymorphic relationships to pass the $root of the comment into the constructor.

Jobs can be created using an artisan command, which I recommend: php artisan make:job. You can pass a name in, or artisan will ask you for one. In this case, I named it InsertComment and extracted the behavior from the UI component.

...
class InsertComment implements ShouldQueue
{
    use Queueable;

    /**
     * Create a new job instance.
     */
    public function __construct(
        public Conversation $conversation,
        public User $user,
        public Model $root,
        public string $body,
    ) {}

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        Comment::create([
            'conversation_id' => $this->conversation->id,
            'user_id' => $this->user->id,
            'commentable_id' => $this->root->id,
            'commentable_type' => $this->root::class,
            'body' => $this->body,
        ]);
    }
}

…and then we can update our form component to use the new job by calling the ::dispatch() static method, passing in the parameters for the constructor. The comment root is enumerated into its id and class, so it doesn’t need to be typed specifically, but this gives future developers some guide as to what a “root” may be, in context.

...
new class extends Component {
    #[Validate('required')]
    public string $body;

    public Conversation $conversation;
    public $root;

    public function submit()
    {
        $this->validate();
        InsertComment::dispatch(
            $this->conversation,
            Auth::user(),
            $this->root,
            $this->body,
        );
        // $this->dispatch('comment-created', rootId: $this->root->id);
    }
}; ?>

<form wire:submit="submit" class="space-y-6">
    <div>
        <x-input-label>Add Comment</x-input-label>
        <x-text-input class="block w-full" wire:model="body" />
        <x-input-error :messages="$errors->get('body')"/>
    </div>
    <x-primary-button>Submit</x-primary-button>
</form>

As you can see, I’ve commented out the code that previously dispatched a browser event after a comment was created. In the old code, that was fine: single-threaded PHP interpreted the code line by line, so by the time the browser event fired, we could be sure a new comment existed in the DB. Now we cannot be so sure; the application may well process the browser event before the comment is inserted. So the next step is to broadcast an event to our clients once the job has done its work. While the job might seem like a perfectly good place to broadcast from, keep in mind that one of the other superpowers of jobs on the queue is that they are isolated to maximize performance. The end result is that any data you didn’t inject into the job’s constructor will not be available. The queue has access to the application in memory, but likely won’t be able to track any changes in the app state.
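One operational note: a queued job only runs while a worker is processing the queue, so in development that means keeping a worker running in a separate terminal. This is the stock artisan command:

```shell
# Process queued jobs as they arrive
php artisan queue:work
```

In production you’d keep this running under a process manager such as Supervisor or systemd, so it restarts if it dies.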

Broadcasting server events to a Livewire component

Laravel’s broadcasting capabilities, as of Laravel 11, are no longer configured by default. Websockets and RTC are managed by a separate server running alongside your web server, and that server can either be managed by you or provided by a third-party service. If you already have websockets and broadcasting configured on your app, you can move on; if not, you may want to review Laravel’s docs on broadcasting. If you are starting from scratch, I would recommend using Laravel Reverb.
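If you are starting from scratch, the broadcasting scaffolding can be set up with artisan. These are stock Laravel 11 commands (the installer will offer to install Reverb for you), but check the docs for your version:

```shell
# Scaffold broadcasting configuration (offers to install Reverb)
php artisan install:broadcasting

# Run the websocket server alongside your web server
php artisan reverb:start
```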

Broadcasting is related to the queue in that the queue is tasked with prioritizing the events you are broadcasting. The difference is that a broadcast event can be received by a user’s client, which can then request a fresh set of data and a new UI; that is how events differ from jobs on the queue. Similar to a job, an event can be set up using an artisan command: php artisan make:event.

After creating the event, I set it up as follows:

...
class CommentCreated implements ShouldBroadcast
{
    use Dispatchable, InteractsWithSockets, SerializesModels;

    /**
     * Create a new event instance.
     */
    public function __construct(
        public string $rootId,
    ) {}

    /**
     * Get the channels the event should broadcast on.
     *
     * @return \Illuminate\Broadcasting\Channel
     */
    public function broadcastOn(): Channel
    {
        return new Channel('comment.' . $this->rootId);
    }
}

I made two changes to the class as created by artisan.

  1. I had the class implement ShouldBroadcast
  2. I changed the broadcastOn method to return a single public channel instead of the default, an array containing a private channel.

For a chat or other peer-to-peer style socket communication, you will want to make sure that the correct client is being communicated with and that they are authorized to be on the channel, but for updating a web page after an insert, using a public channel allows comments created by one user to show up in real-ish time for a second user viewing the same page.

I then modified the handle method on the job to queue the event after the comment is inserted:

...
use App\Events\CommentCreated;
...
class InsertComment implements ShouldQueue
{
    use Queueable;

    /**
     * Create a new job instance.
     */
    public function __construct(
        public Conversation $conversation,
        public User $user,
        public Model $root,
        public string $body,
    ) {}

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        Comment::create([
            'conversation_id' => $this->conversation->id,
            'user_id' => $this->user->id,
            'commentable_id' => $this->root->id,
            'commentable_type' => $this->root::class,
            'body' => $this->body,
        ]);
        CommentCreated::dispatch($this->root->id);
    }
}

Your app needs routes set up for broadcasting, so when you configure your app for broadcasting, a channels.php file is added to your routes directory.

use Illuminate\Support\Facades\Broadcast;

Broadcast::channel('App.Models.User.{id}', function ($user, $id) {
    return (int) $user->id === (int) $id;
});

// The callback's first argument is the authenticated user;
// route parameters like {rootId} follow it.
Broadcast::channel('comment.{rootId}', function ($user, $rootId) {
    return true;
});

The channel I added gets a channel name as the first parameter. This will be used later to identify which element may need to be updated. The callback passed in as the second parameter should return true or false to determine whether the client is authorized. Similar to a job, any data required to make that decision has to be passed in, because extracting data from the application in memory is not trustworthy “on the queue”. Now we can go back and modify our Livewire components to detect the event and update the component. Here’s the view that was previously detecting the browser event to display a new comment; the form we looked at previously is pulled in near the bottom of the template, and the method that handles the event is near the top of the class.

...
new #[Layout('layouts.app')] class extends Component {
    public Conversation $conversation;
    public bool $showRootComment = false;

    #[On('comment-created')]
    public function commentCreated($rootId) {
        if ($this->conversation->id === $rootId) {
            $this->reset(['showRootComment']);
            $this->conversation->refresh();
        }
    }

    public function mount(Community $community, Contribution $contribution) {
        $this->conversation = Conversation::where('community_id', '=', $community->id)
            ->where('contribution_id', '=', $contribution->id)
            ->sole();
    }

    public function with() {
        return [
            'comments' => $this->conversation->comments()->paginate(10),
        ];
    }
}; ?>

<div>
    <x-header>
        <div class="flex items-center justify-between">
            <x-h1>
                {{ $conversation->contribution->name }}
            </x-h1>
            <div class="flex flex-col space-y-1">
                @if($conversation->community->userIsSubscribed(Auth::user()))
                    <x-primary-button wire:click="$toggle('showRootComment')">
                        Comment
                    </x-primary-button>
                    <livewire:components.turn.toggle :root="$conversation" key="vote-{{$conversation->id}}" />
                @endif
                <x-secondary-button-link :href="$conversation->contribution->url" target="_BLANK">
                    Visit
                </x-secondary-button-link>
            </div>
        </div>
    </x-header>
    @if($showRootComment)
        <x-content-card>
            <livewire:components.comment.form :conversation="$conversation" :root="$conversation" :key="$conversation->id" />
        </x-content-card>
    @endif
    @if($comments->count())
        <x-content-card>
            <x-h2>The Conversation</x-h2>
            <div class="space-y-6">
                @foreach($comments as $comment)
                    <livewire:components.comment.card
                        :comment="$comment"
                        :conversation="$conversation"
                        :root="$comment"
                        :key="$comment->id"/>
                @endforeach
            </div>
            {{ $comments->links() }}
        </x-content-card>
    @endif
</div>

Under the hood, Laravel uses a client-side package called Laravel Echo to receive Reverb’s broadcasts in the browser. Livewire can detect these events and react to them just like any other browser event, so you can modify the above code as follows to have it track the browser event that is generated when the server event is broadcast:

...
new #[Layout('layouts.app')] class extends Component {
    public Conversation $conversation;
    public bool $showRootComment = false;

    public function getListeners() {
        return [
            "echo:comment.{$this->conversation->id},CommentCreated" => 'commentCreated'
        ];
    }

    public function commentCreated() {
        $this->reset(['showRootComment']);
        $this->conversation->refresh();
    }
...

So this closes the loop: a user triggered a queued job; when the job finished, it dispatched an event; the event was broadcast to active sessions; and the component refreshed its state. Although, to call it completely finished, you will probably want to show some kind of loading state in your UI when kicking off the job, which can be cleared when the event fires and is broadcast.

PHP-in-Ruts

I have spent the last four years of my life as a professional PHP developer working on a Laravel web app for a final mile logistics company. So a lot of what I did was to build specific web pages to support normal business operations like creating, retrieving, updating, and deleting orders, managing customers, managing user rights and permissions, and generating invoices and payments.

For the most part, our database schema was where complexity was housed, and our code was about as transparent as it could be using PHP. My mentor, also my boss, had a lot of experience working on a large eCommerce web app built in Ruby on Rails, which was at one time the new and hot framework in web development. As we worked our way up from Laravel 5 to 7 and eventually 10, we started implementing some of the more modern syntax and niceties from these versions. The log-jam started to clear and features really started to fly! So obviously the company downsized its development team by half, and the last bit of code in my portfolio happens to be from when I was in boot camp. So here we go: let’s build a Reddit clone with Laravel and talk about how it’s really slick now.

Without burying the lede: I’ve got a good jump on this project, and I’ve got a few things to say about Laravel 11, Livewire, and Volt.

First, Volt: If you look in the controllers directory, you’ll see there are none. They are not necessary when you’re using a Volt route and a full page Volt component. Looking more closely, you can see that route model binding works the same way it does with existing Laravel routes. The resulting models can be type-hinted in the mount method on the Volt component. One of the really nice improvements added by the Volt package is the ability to have the logic and markup in the same file.

While a lot of the credit I’m giving here goes to the new bits that have become easier to implement in later versions of Laravel, the use of Blade syntax and modular Blade components is really the star that allows the Volt components to be oh-so-readable. I assume that had I written out all the Tailwind classes in the component I linked above, it would be a nightmare to look at. While I’ll never be a designer, my time at a marketing firm did help me establish a few good habits when it comes to the separation of code from markup, the separation of markup from styling, and how to build reusable UI atoms.

I’ve pushed fewer than 100 commits to this project as of right now, but I have accomplished the following:

  • Implement boilerplate Auth
  • Create basic social media models:
    • Conversation: a user creates these by submitting a URL to a community
    • Community: a repository for conversations, users can subscribe by ‘claiming’ the community.
    • Users can comment in a conversation.
    • Users can comment on comments infinitely.
    • Users can vote on comments and conversations.
  • Server side jobs: voting is queued
  • WebRTC / websockets: new votes are broadcast and update when a voting job is completed

As always, testing in Laravel is incredibly easy. Without controllers, it seems much easier to figure out what and how to test my models. Coverage is currently around 85%. The issue in coverage emanates from the parts of the Auth system I’m not using. Livewire really does allow you to think of your web application as a set of views that are intrinsically bound to the models they are hydrated by.

I’m sure Alpine.js is a fine lightweight JS library, but here’s what I love about modern Laravel:

  • Don’t write controllers
  • Don’t write JavaScript
  • Bring unit-testable model behaviors directly into the view

Real Talk

LLMs will eventually, some day in the future, be as smart as humanity was at some point in the past.

As of the summer of 2024, the pace of Large Language Model development has slowed. The concept of the eventual emergence of artificial general intelligence is waning in both its scale and popularity. Which means that it’s probably just about time for AI’s “killer app” to emerge.

LLMs are mostly a tool by now. People smarter than I am are starting to apply these tools to actual problems they might be good for, so for the rest of us, here’s a quick break-down of how these things work.

LLMs are born when fed a huge set of text data. For the most part, the data sets are big outdated portions of the internet with some of the garbage and useless nonsense scooped out. The “black box” is the analysis of the text which figures out the mathematical relationships between the words.

Afterwards, the models are “fine tuned” using sets of inputs in the form of a question and a known good answer. The “black box” is then told to use those question-answer pairs to refine its output to create answers that align with the known good questions and answers.

If the question is about something that doesn’t exist in the initial data sets, most of the big LLMs are able to acquire new text data. For example, if ChatGPT doesn’t know what you’re talking about, it’s allowed to use Bing Search to find new documents on the internet to ingest. This is called “Retrieval Augmented Generation” or RAG.

So that’s why I’m not scared, because LLMs will never (seemingly ever) know anything that someone hasn’t said before. LLMs will eventually, some day in the future, be as smart as humanity was at some point in the past. Even if LLMs are ingesting text generated by LLMs, if you dig deeply enough into the trail of documents parsed by any given LLM eventually you’ll find my angsty LiveJournal posts from 2001, or an essay about Roy Orbison completely wrapped in cling film.

And for what it’s worth, it’s only gonna take a few more of these before the industry backs away from these tools significantly.

It's Radiohead

So I listened to Wall of Eyes by Radiohead. I guess it was some kind of trick to confuse the press? They released it under some other name. Anyway, it’s a journey of how one would write a song if you had Thom Yorke’s voice at your disposal. It was really good, and it ended with a lilting Thom Yorke piano ballad. No notes, Radiohead.

J Mascis’ new album What Do We Do Now was very comfortable for a sometimes Dinosaur Jr lover. It has what you’ve come to appreciate from J’s solo work: dynamics. It lacks the obfuscation and complexity of a Dinosaur Jr. album, but it was very easy to listen to.

Released this week, every week

Every week for the past several years I have listened to a new album. To me, ‘new’ means an album which was released within 7 days of my listening to it. I have learned about a lot of stuff that I really like by doing this. I assure you that the practice started in a very grass-roots fashion. Over time, it became a challenge to myself to see how long I could keep it going, and now it’s just something that I do. Things like this have no right to get boring, though. So I’m going to try to take some time to document some of my experiences this year.

Zwierzakom by Cisnienie is an epic post-rock experience of extra-long, textured, minimalist songs. The tracks often build from a pulse and crescendo in the way that post rock does. The first tune is 20 minutes, and at its end the apparent crowd noise informs us that this is also a bunch of individual live takes. It’s impressive, but I was gasping for air by the fourth tune…which is 55 minutes in. Released 30 Dec 2023. Listened 3 Jan 2024.

Letter to Self by SPRINTS gets the replay award for this post. This album has something that I can’t put my finger on. The singer is super emotive and seems to push hard on all the lyrics in that typical aggressive Irish way that seems to be very endearing to us Yanks. It’s like if Lemmy came back from the dead and the first thing he said was “I wanna hear something with guitars to make everyone forget Sleigh Bells ever existed” and SPRINTS was right there like “We’re doing it.” Released 5 Jan 2024, listened 8 Jan 2024.

If this is AI, I'm both the dwarf AND the pliers.

I signed up for ChatGPT and had the following conversation:

Alex Szeliga: Why does the porridge bird lay its egg in the air?

ChatGPT: I’m sorry, but the porridge bird is a fictional creature that exists only in a nursery rhyme. There is no scientific explanation for why it lays its egg in the air, as it is purely a fantastical concept created for the purpose of entertainment or education. The rhyme may have been designed to help children develop their imagination or language skills.

Alex Szeliga: Why are you sorry?

ChatGPT: As an AI language model, I don’t have emotions, so I don’t actually feel sorry. However, I used that phrase as a way to acknowledge that I was unable to answer your initial question about the porridge bird laying its egg in the air. My apologies if my previous response was unclear.

Alex Szeliga: Are you familiar with The Firesign Theater?

ChatGPT: Yes, I’m familiar with The Firesign Theatre. It was an American comedy group that was popular in the late 1960s and early 1970s. The group consisted of four members: Phil Austin, Peter Bergman, David Ossman, and Philip Proctor. They were known for their innovative and surreal comedy style, which often involved intricate wordplay, social commentary, and satire. Some of their most well-known works include the albums “Don’t Crush That Dwarf, Hand Me the Pliers” and “I Think We’re All Bozos on This Bus,” which are considered classics of the genre.

Alex Szeliga: They wrote a sketch called “The Breaking of the President” where a natural language computing model was “broken” by asking it about the porridge bird laying its egg in the air. In the sketch, the language model responded with a lot of gibberish, but eventually settled on one answer: No.

ChatGPT: That’s an interesting piece of trivia! It’s not surprising that a natural language computing model from that time period would struggle to understand and respond coherently to a question about a fictitious creature like the porridge bird laying an egg in the air. However, it’s worth noting that with advancements in natural language processing and machine learning techniques in recent years, modern language models are now able to understand and generate more complex language and even generate creative and imaginative responses.

…Did ChatGPT just jive me?

“…because if we’re lucky tomorrow, we won’t have to answer questions like yours anymore”

A young talented artist needs to be protected.

I was raised in a family of musicians and music aficionados. No instrument or music was out of bounds, but no one was allowed to like music just to like it. The lowest form of music was music that was meant to be ignored like in an elevator or the grocery store. Our troupe were bards, not scholars, so extra emphasis was laid on untrained and naturally gifted players.

As a boy with a drumset in the basement, but not yet any actual drummers in the family, I took close notice of the few drummers mentioned in this way: Ringo, Levon, but especially Ginger. Two refrains: he was a jazz musician in London before becoming a global mega-star of rock drumming, and “his drum solos were so precisely tailored to the songs that you could hum the tune and the band would just pick it back up; it was an actual drum solo in the composition, not just a drummer flailing about wildly.”

As one does, I trusted this as gospel. It came directly from the people around me I trusted with my development as an artist. Then I started to learn about Mr. Baker. I watched the solos and the tape available. I listened to the recordings, learned about the man, his legacy and thought to myself, what is different about Ginger Baker and John Bonham? Why is it that more people will remember White Room than will know who John Bonham is, but more drummers will remember John Bonham than Ginger Baker?

Fucking ego. Both men had terrible ego problems, and their unwillingness to put it down behind the kit is what made them interesting to watch, but we’re done with watching our artists die before us for pay. We want them to live, and that has created JD Beck.

JD Beck is an American drummer, producer and songwriter. If you don’t know, listen to this then this, then you’ll know. They are currently promoting their first album and I’m very excited to watch their career and see what kind of new and interesting things they bring to the drums. Anderson .Paak; you’re on notice. Don’t let this one do anything dumb. I wanna see what Jaco would be like at 80.

JD is special because the way that JD plays the drums is respectful of how humans experience the drums. As much as Ginger and Bonzo and the like would have you believe that you need resonance and a broad range of sonic options to make the drums fit into a broader musical composition, JD has discovered that it’s not true. In much the same way that Jaco Pastorius discovered a relationship with how people want to experience his instrument in their music, JD has taken an example from a generation of musicians who are digital natives.

On top of all of that, the average quality of speakers has never been shittier. While Ginger and Bonzo will tell you 4 toms minimum, the average person can’t tell the difference between a drum set with 2 and a drum set with 5. While this is fairly unscientific, my wife can’t actually hear the chord changes in most songs with distorted guitars, and she’s not alone. While musicians and auteurs crave fidelity and the extended information it provides, the amount of auditory information coming off a drum set is a lot. Need more proof? Turn on the radio to a station that plays pop music. Listen for a few minutes, then SING the drum part. Regardless of how you go about it or what song you hear, it will sound something like this.

This is how tabla drum music is shared between players. Even a simple drum like the tabla, in context, presents so much information to the listener that the actual performance is obscured, so the players use a simplified language of mouth sounds that sound like drums. Mouth drums. The mouth drum set is always rudimentary. There are no extra tom-tom drums or cymbals on the mouth drum set. Similarly, there are no extra drums or cymbals on JD Beck’s drum set. The drums sound short and the cymbals sound high and scratchy. The floor tom and the bass drum make essentially the same noise, but the listener pays no penalty. In fact, the listener isn’t forced to think about that decision at all.

One of the things that makes JD Beck’s actual playing so special is that when they perform, JD makes the crowd feel the beauty of metric modulation, rudimental drumming, jazz comping, and a couple hundred years of tradition while completely missing the off-ramp to the drummer’s circle-jerk. By doing so, JD has created a path to feeling the way it feels to play the drums from your soul. John Bonham just wanted to share that with his family, and this profession tore him away from that. Ginger Baker was a sociopath who wanted to be a famous drummer. Of course they both loved the off-ramp to the drummer’s circle-jerk. To be fair, JD Beck had a whole generation of hip-hop, synthpop, and electronic music producers who had already, by necessity, stripped most of the information out of the drum set; JD is just the boldest and brightest of a generation of drummers who will likely do the same.

Here’s what I mean when I say JD is curating a selection of complicated musical conversations and making you feel them. In this exercise, you’ll listen to a short part of a song twice. The first time, you just listen to it. Just enjoy it, but bob your head or tap your foot.

JD Beck @ Zildjian Live: give it about 30 seconds to a minute.

The second exercise is to do that thing we do with all pop music: try to divide it into 4. Start back around 4:20 in the song, and whenever you feel comfortable, start counting out 1, 2, 3, 4. You will immediately get that “rubbing my belly and patting my head” feeling. The music follows the familiar “4/4” pattern. You can stop and go back to listening for entertainment now. The tune is on their new album as a track called 2 SHRiMPZ featuring Mac DeMarco.

Anyway, thank you for coming to my TED Talk. Musicians who play drums need to be protected, and not because they are in danger of becoming machines, but because they are in danger of being just a drummer.

The End of the (Third) Party

It’s frustratingly, stupidly difficult to integrate third party code into an existing project.

When I’m looking for solutions to problems I’m being paid to solve, I can usually think about it from at least two perspectives outside of my own and still make progress. When I try and defend more than three positions, it becomes increasingly difficult to believe that anything shared across all of them isn’t just an artifact of my bias.

In my day-to-day life as a developer, a feature usually requires me to think of two abstract characters: “the business” and “the next developer”. I can have a conversation with these two entities in my head and usually come up with a solution to any problem that I’m being paid to solve which I can feel confident will be ready to present to anyone on my team or within my organization.

The introduction of third party code is like the introduction of a completely new voice to the conversation, and it’s the voice of someone who doesn’t even go here.
She doesn't even go here meme

When you consider adding third party code to your codebase, you’re asking for another voice in the room, but it’s not a voice you should expect to harmonize perfectly with your existing trio. You should expect this collaboration to be difficult. That being the case, going in prepared is the best way to be successful. The promise of third party code is always the moon and the stars…which are actually free and not special.

The advice that I can provide to people who are considering or are about to add third party code is simple: make sure your implementation requires third party code, know the ecosystem beyond the most popular libraries, and implement based on your needs.

If possible, use only “first party” third party code. If you are implementing Firebase Auth on your NextJS front end, you may not need to install anything beyond the Firebase NPM package, or you may want to install next-firebase-auth, and you might want to include react-firebaseui to eliminate as much development time as possible. All of these choices are paths that can result in success, but each choice will result in a more convoluted package manifest and more pinned requirements. The fewer moving pieces you add, the more time you’ll have to spend implementing your solution. Adding more, in this case, nets you a vastly simpler developer experience to get started, but customization will come at the cost of a simple codebase.

…but then again…the nuclear option would be if you can “drink from the firehose”. Firebase has a public API. There’s no reason I have to use any extra code; I should be able to do this with “fetch” alone…

Do you hear yourself? You sound crazy.
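For the crazy among us, here’s roughly what that firehose approach looks like: a sketch of an email/password sign-in against Firebase Auth’s public Identity Toolkit REST endpoint, no SDK at all. The API key, email, and password below are placeholders, not anything real.

```typescript
// A fetch-only sign-in against Firebase Auth's public REST API -- no SDK.
// The endpoint is Identity Toolkit's "signInWithPassword" call.

type SignInRequest = {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
};

// Build the request separately from sending it, so the "no extra code"
// claim is easy to inspect (and to test without hitting the network).
function buildSignInRequest(apiKey: string, email: string, password: string): SignInRequest {
  return {
    url: `https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?key=${apiKey}`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ email, password, returnSecureToken: true }),
    },
  };
}

// The actual firehose: one fetch. On success the JSON body carries
// idToken, refreshToken, and localId.
async function signIn(apiKey: string, email: string, password: string) {
  const { url, init } = buildSignInRequest(apiKey, email, password);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`sign-in failed: ${res.status}`);
  return res.json();
}
```

Of course, now you own token refresh, persistence, and every error code yourself, which is exactly the kind of sentence that should make you put the keyboard down.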