Setting up a GPG Key with git to sign your commits

Signing your git commits with GPG is really easy to set up, and I’m always surprised by how many developers I meet who don’t do it.

Of course it’s not required to push commits and has no bearing on the quality of your code. But that green verified message next to your commits does feel good.

Essentially there are three parts to this:

  1. Create your GPG key
  2. Tell git to use your GPG key to sign your commits
  3. Upload the public part of your GPG key to Gitlab / Github / etc

Creating the GPG key if needed

gpg --full-generate-key

In the interactive guide, I choose:

  1. (1) RSA and RSA (default)
  2. 4096 bits long
  3. Does not expire
  4. Fill in Name, Email, Comment and Confirm.
  5. Enter passphrase when prompted.

Getting the Key ID

This will list all of your keys:

gpg --list-secret-keys --keyid-format=long

Example of the output:

sec   rsa4096/THIS0IS0YOUR0KEY0ID 2020-12-25 [SC]
      KGHJ64GHG6HJGH5J4G6H5465HJGHJGHJG56HJ5GY
uid                 [ultimate] Bob GPG Key <mail@your-domain.co.uk>

In that example, the key id that you would need next is “THIS0IS0YOUR0KEY0ID” from the first line, after the forward slash.

Tell your local git about the signing key

To set the gpg key as the signing key for all of your git projects, run the following global git command:

git config --global user.signingkey THIS0IS0YOUR0KEY0ID

If you want to do it on a repository by repository basis, you can run it from within each project, and omit the --global flag:

git config user.signingkey THIS0IS0YOUR0KEY0ID

Signing your commits

You can either set commit signing to true for all projects as the default, or on a repo-by-repo basis.

# global
git config --global commit.gpgsign true

# local
git config commit.gpgsign true

If you wanted to, you could even decide to sign commits individually, by not setting it as a config option and instead passing a flag on each commit:

git commit -S -m "My signed commit message"
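To check locally that signing is actually happening, you can ask git to show the signature details of your latest commit:

# Show GPG signature information for the most recent commit
git log --show-signature -1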

Adding your public key to gitlab / github / wherever

Firstly export the public part of your key using your key id. Again, using the example key id from above:

# Show your public key in terminal
gpg --armor --export THIS0IS0YOUR0KEY0ID

# Copy straight to your system clipboard using "xclip"
gpg --armor --export THIS0IS0YOUR0KEY0ID | xclip -sel clipboard

This will spit out a large block of key text, beginning and ending with comment lines. Copy all of the text it gives you and paste it into the GPG textbox in your git forge of choice – gitlab / github / gitea / etc.

How I use vimwiki in neovim

This post is currently in-progress, and is more of a brain-dump right now. But I like to share as often as I can otherwise I’d never share anything 🙂

Please view the official Vimwiki Github repository for up-to-date details of Vimwiki usage and installation. This page just documents my own processes at the time.

Installation

Add the following to plugins.lua

use "vimwiki/vimwiki"

Run the following two commands separately in the neovim command line:

:PackerSync
:PackerInstall

Close and re-open Neovim.

How I configure Vimwiki

I have 2 separate wikis set up in my Neovim.

One for my personal homepage and one for my commonplace site.

I set these up by adding the following in my dotfiles, at the following position: $NEOVIM_CONFIG_ROOT/after/plugin/vimwiki.lua. So for me that would be ~/.config/nvim/after/plugin/vimwiki.lua.

You could also put this command inside the config function in your plugins.lua file, where you require the vimwiki plugin. I just tend to put all my plugin-specific settings in their own “after/plugin” files for organisation.

vim.cmd([[
  let wiki_1 = {}
  let wiki_1.path = '~/vimwiki/website/'
  let wiki_1.html_template = '~/vimwiki/website_html/'
  let wiki_2 = {}
  let wiki_2.path = '~/vimwiki/commonplace/'
  let wiki_2.html_template = '~/vimwiki/commonplace_html/'
  let g:vimwiki_list = [wiki_1, wiki_2]
  call vimwiki#vars#init()
]])

The path keys tell vimwiki where to place the root index.wiki file for each wiki you configure.

The html_template keys tell vimwiki where to place the compiled html files (when running the :VimwikiAll2HTML command).

I keep them separate as I am deploying them to separate domains on my server.

When I want to open and edit my website wiki, I enter 1<leader>ww.

When I want to open and edit my commonplace wiki, I enter 2<leader>ww.

Pressing those key bindings for the first time will ask you if you want the directories to be created.

How I use vimwiki

At the moment, my usage is standard to what is described in the Github repository linked at the top of this page.

When I develop any custom workflows I’ll add them here.

Deployment

Setting up a server to deploy to is outside the scope of this post, but I hope to write up a quick guide soon.

I run the following command from within vim on one of my wiki index pages, to export that entire wiki to html files:

:VimwikiAll2HTML

I then SCP the compiled HTML files to my server. Here is an example scp command that you can modify with your own paths:

scp -r ~/vimwiki/website_html/* your_user@your-domain.test:/var/www/website/public_html

For the best deployment experience, I recommend setting up ssh key authentication to your server.

For bonus points I also add a bash / zsh alias to wrap that scp command.
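Something like the following in your ~/.bashrc or ~/.zshrc would do the trick (the alias name and paths here are just placeholders, so swap in your own):

# Hypothetical alias wrapping the scp deploy command above.
alias deploy-wiki='scp -r ~/vimwiki/website_html/* your_user@your-domain.test:/var/www/website/public_html'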

General plugins I use in Neovim

I define a “general plugin” as a plugin that I use regardless of the filetype I’m editing.

These will add extra functionality for enhancing my Neovim experience.


I use Which-key for displaying keybindings as I type them. For example if I press my <leader> key and wait a few milliseconds, it will display all keybindings I have set that begin with my <leader> key.

It will also display any marks and registers I have set, when only pressing ' or @ respectively.

use "folke/which-key.nvim"

Vim-commentary makes it super easy to comment out lines in files using vim motions. So in normal mode you can enter gcc to comment out the current line; or 5gcc to comment out the next 5 lines.

You can also make a visual selection and enter gc to comment out that selected block.

use "tpope/vim-commentary"

Vim-surround provides me with an extra set of abilities on text objects. It lets me add, remove and change surrounding elements.

For example I can place my cursor over a word and enter ysiw" to surround that word with double quotes.

Or I can make a visual selection and press S" to surround that selection with double quotes.

use "tpope/vim-surround"

Vim-unimpaired adds a bunch of extra mappings that tpope had in his own vimrc, which he extracted to a plugin.

They include mappings for the [ and ] keys for previous and next items. For example, using [b and ]b moves backwards and forwards through your open buffers, whilst [q and ]q will move you backwards and forwards through your quickfix list items.

use "tpope/vim-unimpaired"

Passive plugins I use in Neovim

These plugins I use in Neovim are ones I consider “passive”. That is, they just sit there doing their thing in the background to enhance my development experience.

Generally they won’t offer extra keybindings or commands I will use day to day.

You can view all the plugins I use in my plugins.lua file in my dotfiles.


Vim-lastplace will remember the last edit position of each file you’re working with and place your cursor there when re-entering.

use "farmergreg/vim-lastplace"

Nvim-autopairs will automatically add closing characters when opening a “pair”, such as {, [ and (. It will then place your cursor between the two.

use "windwp/nvim-autopairs"

Neoscroll makes scrolling smooth in neovim.

use "karb94/neoscroll.nvim"

Vim-pasta super-charges pasting in neovim by preserving indentation when pasting content in with p and P.

use({
  "sickill/vim-pasta",
  config = function()
    vim.g.pasta_disabled_filetypes = { 'fugitive' }
  end,
})

Here I am passing a config function to disable vim-pasta for “fugitive” filetypes. “Fugitive” is in reference to the vim-fugitive plugin that I will explain in another post.


Nvim-colorizer will highlight any colour codes you write out.

use "norcalli/nvim-colorizer.lua"

How I use Neovim

I try to use Neovim for as much development-related work as possible.

This page serves as a point of reference for me, and other people interested, for what I use and how I use it.

Feedback is welcome, and I’d love to know how you use Neovim too!

My complete Neovim configuration files can be found on Github.

  1. How I organise my Neovim configuration
  2. Passive plugins I use in Neovim
  3. General plugins I use in Neovim
  4. Development plugins I use in Neovim – coming soon
  5. Database client in Neovim (vim-dadbod and vim-dadbod-ui) – coming soon
  6. REST client in Neovim (vim-rest-client) – coming soon
  7. Personal Wiki in Neovim (vim-wiki) – coming soon

Incredible shot from The Last of Us episode 6

Another incredible episode of The Last of Us.

The references were so great too. Shimmer; Dina in the background; farm with sheep.

Next week will be the episode with David and it’s gonna be so very dark and have one of the best episode endings so far I reckon.

Inventory app — saving inventory items.

This is the absolute bare bones minimum implementation for my inventory keeping: saving items to my inventory list.

Super simple, but meant only as an example of how I’d work when working on an API.

Here are the changes made to my Inventory Manager. Those changes include the test and logic for the initial index endpoint too. I may blog about that part in a separate post soon.

Writing the store test

One of Laravel’s many strengths is how well it is set up for testing and just how nice those tests can read. Especially now that I’ve started using Pest.

Here is the test I wrote for the store endpoint I was yet to write:

test('inventory items can be created', function () {
    $response = $this->postJson(route(name: 'inventory.store'), [
        'name' => 'My Special Item',
    ]);

    $response->assertStatus(201);

    $this->assertDatabaseHas(Inventory::class, [
        'name' => 'My Special Item',
    ]);
});

Firstly, I post to an endpoint that I am yet to create, with the most minimal payload I want – an item’s name:

$response = $this->postJson(route(name: 'inventory.store'), [
    'name' => 'My Special Item',
]);

Then I can check I have the correct status code: an HTTP Created 201 status:

$response->assertStatus(201);

Finally I check that the database table where I will be saving my inventory items has the item I have created in the test:

$this->assertDatabaseHas(Inventory::class, [
    'name' => 'My Special Item',
]);

The first argument to the assertDatabaseHas method is the model class, which Laravel will use to determine the name of the table for that model. Either by convention, or by the value you override it with on the model.

The second argument is an array that should match the table’s column name and value. Your model can have other columns and still pass. It will only validate that the keys and values you pass to it are correct; you don’t need to pass every column and value — that would become tedious.

Writing the store logic

There is a term I’ve heard in Test-driven development called “sliming it out”. If I remember correctly, this is when you let the test feedback errors dictate every single piece of code you add.

You wouldn’t add any code at all until a test basically told you to.

I won’t lie – I actually love this idea, but it soon becomes tiresome. It’s great to do when you start out in TDD, in my opinion, but soon you’ll start seeing things you can add before running the test.

For example, you know you’ll need a database table and a model class, and most likely a Model Factory for upcoming tests, so you could run the artisan command to generate those straight away:

php artisan make:model -mf Inventory

# or with sail
./vendor/bin/sail artisan make:model -mf Inventory
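For reference, the generated migration then just needs a name column added – something along these lines (a sketch, assuming Laravel’s default naming convention of an "inventories" table for the Inventory model):

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('inventories', function (Blueprint $table) {
            $table->id();
            $table->string('name'); // the only custom column needed so far
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('inventories');
    }
};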

I don’t tend to generate my Controller classes with these, as I now use single-action controllers for personal projects.

Store Route

Within the routes/web.php file, I add the following:

use App\Http\Controllers\Inventory\StoreController;

Route::post('inventory', StoreController::class)->name('inventory.store');

I’m using a single-action class here to keep the logic separated. Some would see this as over-engineering, especially if you keep controller code to a minimum anyway, but I like the separation.

Adding an explicit “name” to the endpoint means I can refer to it throughout the app by that name, like in the test code above where I generate the endpoint with the “route” helper function:

route(name: 'inventory.store')

Store Controller

<?php

declare(strict_types = 1);

namespace App\Http\Controllers\Inventory;

use App\Http\Requests\InventoryStoreRequest;
use App\Models\Inventory;
use Illuminate\Contracts\Routing\ResponseFactory;
use Illuminate\Http\Response;

class StoreController
{
    public function __invoke(InventoryStoreRequest $request): Response|ResponseFactory
    {
        Inventory::create([
            'name' => $request->get(key: 'name'),
        ]);

        return response(content: 'Inventory item created', status: 201);
    }
}

Super straightforward at the moment. After receiving the request via the custom request class (code below), I just create an inventory item with the name from the request.

I then return a response with a message and an HTTP Created 201 status.

This code does assume that it was created fine so I might look at a better implementation of this down the line…

…but not before I have a test telling me it needs to change.
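One thing worth noting: for Inventory::create() to accept that array, the name attribute needs to be mass assignable on the model. A minimal sketch of what that looks like (not necessarily the exact model in my repo):

<?php

declare(strict_types = 1);

namespace App\Models;

use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Database\Eloquent\Model;

class Inventory extends Model
{
    use HasFactory;

    // Allow 'name' to be mass assigned via Inventory::create().
    protected $fillable = ['name'];
}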

InventoryStoreRequest class

This is a standard generated request class with the following rules method:

/**
 * Get the validation rules that apply to the request.
 *
 * @return array<string, mixed>
 */
public function rules(): array
{
    return [
        'name' => 'required',
    ];
}

Again, nothing much to it. It makes sure that a name is required to be passed.

It’s not saying anything about what that value could be – we could pass a datetime or a ridiculously long string.

I’ll fix that in a future post.

An extra test for the required name

In order to be “belt and braces”, I have also added a test that proves we require a name to be passed. Pest makes this laughably simple:

test('inventory items require a name', function () {
    $this->postJson(route(name: 'inventory.store'))
        ->assertJsonValidationErrorFor('name');
});

This just performs a post request to the store endpoint, but passes no data. We then just chain the assertJsonValidationErrorFor method, giving it the parameter that should have caused the failed validation. In this case “name”.

As the validation becomes more sophisticated I will look at adding more of these tests, and possibly even running all “required” fields through the same test method with Pest’s dataset functionality – essentially the same idea as PHPUnit’s Data Providers.

Useful Links

Complete changes in git for when I added the store and the index endpoints to my Inventory app.

Connecting to a VPN in Arch Linux with nmcli

nmcli is the command line tool for interacting with NetworkManager.

For work I sometimes need to connect to a vpn using an .ovpn (openvpn) file.

This method should work for other VPN types, though I’ve only used OpenVPN.

Installing the tools

All three of the required programs are available via the official Arch repositories.
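Assuming those are NetworkManager itself, OpenVPN, and the NetworkManager OpenVPN plugin, installing them would look something like this:

# Assumed package set; adjust if your setup differs.
sudo pacman -S networkmanager openvpn networkmanager-openvpn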

Importing the ovpn file into your Network Manager

Once you’ve got the openvpn file on your computer, you can import it into your Network Manager configuration with the following command:

# Replace the file path with your own correct one.
nmcli connection import type openvpn file /path/to/your-file.ovpn

You should see a message saying that the connection was successfully added.

Activate the connection

Activating the connection will connect you to the VPN specified with that .ovpn file.

nmcli connection up your-file

If you need to provide a password to your vpn connection, you can add the --ask flag, which will make the connection up command ask you for a password:

nmcli connection up your-file --ask

Disconnect

To disconnect from the VPN, just run the down command as follows:

nmcli connection down your-file

Other Links:

Network Manager on the Arch Wiki.

Installing and setting up github cli

What is the github cli

The Github CLI tool is the official Github terminal tool for interacting with your github account, as well as any open source projects hosted on Github.

I’ve only just begun looking into it but am already trying to make it part of my personal development flow.

Installation

You can see the installation instructions here, or if you’re running on Arch Linux, just run this:

sudo pacman -S github-cli

Once installed, you should be able to run the following command and see the version you have installed:

gh --version

Authenticating

Before interacting with your github account, you will need to login via the cli tool.

Generate a Github Personal Access Token

Firstly, I generate a personal access token on the Github website. In my settings page I head to “Developer Settings” > “Personal Access Tokens” > “Tokens (classic)”.

I then create a new “classic” token (just my preference) and I select all permissions and give it an appropriate name.

Then I create it and keep the page open where it displays the access token. This is for pasting it into the terminal during the authentication flow next.

Go through the Github CLI authentication flow

Start the authentication flow by running the command:

gh auth login

The following highlights are the options I select when going through the login flow. Your needs may vary.

What account do you want to log into?
> Github.com
> Github Enterprise Server

What is your preferred protocol for Git operations?
> HTTPS
> SSH

Upload your SSH public key to your Github account?
> /path/to/.ssh/id_rsa.pub
> Skip

How would you like to authenticate Github CLI?
> Login with a web browser
> Paste an authentication token

I then paste in the access token from the still-open tokens page, and hit enter.

You should see it correctly authenticates you and displays who you are logged in as.

Check out the official documentation to see all of the available actions you can perform on your account.
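A few example commands to try once you are authenticated (these are just examples; the docs cover far more):

# Confirm who you are logged in as
gh auth status

# List your repositories
gh repo list

# List open pull requests for the current repository
gh pr list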

Adding Laravel Jetstream to a fresh Laravel project

I only have this post here as there were a couple of extra steps I took after the regular installation, which I wanted to keep a note of.

Here are the changes made to my Inventory Manager.

Follow the Jetstream Installation guide

Firstly I just follow the official installation guide.

When it came to running the Jetstream install command in the docs, this was the specific flavour I ran:

php artisan jetstream:install livewire --pest

This sets it up to use Livewire, as I wanted to learn that along the way, as well as setting up the Jetstream tests as Pest ones.

Again, I’m not too familiar with Pest (still loving phpunit) but thought it was worth learning.

Enable API functionality

I want to build my Inventory Manager as a separate API and front end, so I enabled the API functionality after install.

Enabling the built-in API functionality, which is Laravel Sanctum by the way, is as easy as uncommenting a line in your ./config/jetstream.php file:

'features' => [
    // Features::termsAndPrivacyPolicy(),
    // Features::profilePhotos(),
    Features::api(),
    // Features::teams(['invitations' => true]),
    Features::accountDeletion(),
],

The Features::api() line should be commented out by default; just uncomment it and you’re good to go.

Setup Pest testing

The only thing that tripped me up was that I hadn’t previously set up Pest, which was causing the Jetstream tests to fail.

So I ran the following command from the Pest documentation, modified for my use of Laravel Sail:

./vendor/bin/sail artisan pest:install

I then also added the RefreshDatabase trait to my ./tests/TestCase.php file.
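For reference, that change is roughly this (a sketch, assuming the default Laravel base TestCase):

<?php

namespace Tests;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Illuminate\Foundation\Testing\TestCase as BaseTestCase;

abstract class TestCase extends BaseTestCase
{
    use CreatesApplication;

    // Reset the database between tests.
    use RefreshDatabase;
}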

Then all of my tests pass.

With that, Jetstream is set up and I’m ready to continue.

How I organise my Neovim configuration

The entry point for my Neovim Configuration is the init.lua file.

Init.lua

My entrypoint file simply requires three other files:

require 'user.plugins'
require 'user.options'
require 'user.keymaps'

The user.plugins file is where I’m using Packer to require plugins for my configuration. I will be writing other posts around some of the plugins I use soon.

The user.options file is where I set all of the Neovim settings. Things such as mapping my leader key and setting the number of spaces per tab:

vim.g.mapleader = " "
vim.g.maplocalleader = " "

vim.opt.expandtab = true
vim.opt.shiftwidth = 4
vim.opt.tabstop = 4
vim.opt.softtabstop = 4

...etc...

Finally, the user.keymaps file is where I set any general keymaps that aren’t associated with any specific plugins. For example, here I am remapping the arrow keys to specific buffer-related actions:

-- Easier buffer navigation.
vim.keymap.set("n", "", ":bp", { noremap = true, silent = true })
vim.keymap.set("n", "", ":bn", { noremap = true, silent = true })
vim.keymap.set("n", "", ":bd", { noremap = true, silent = true })
vim.keymap.set("n", "", ":%bd", { noremap = true, silent = true })

In that example, the left and right keys navigate to previous and next buffers. The down key closes the current buffer and the up key is the nuclear button that closes all open buffers.

Plugin-specific setup and mappings

For any plugin-specific setup and mappings, I am using Neovim’s “after” directory.

Basically, for every plugin you install, you can add a lua file within a directory at ./after/plugin/ from the root of your Neovim configuration.

So for example, to add settings / mappings for the “vim-test” plugin, I have added a file at: ./after/plugin/vim-test.lua with the following contents:

vim.cmd([[
  let test#php#phpunit#executable = 'docker-compose exec -T laravel.test php artisan test'
  let test#php#phpunit#options = '--colors=always'
  let g:test#strategy = 'neovim'
  let test#neovim#term_position = "vert botright 85"
  let g:test#neovim#start_normal = 1
]])

vim.keymap.set('n', 'tn', ':TestNearest<CR>', { silent = false })
vim.keymap.set('n', 'tf', ':TestFile<CR>', { silent = false })
vim.keymap.set('n', 'ts', ':TestSuite<CR>', { silent = false })
vim.keymap.set('n', 'tl', ':TestLast<CR>', { silent = false })
vim.keymap.set('n', 'tv', ':TestVisit<CR>', { silent = false })

This means that these settings and bindings will only be registered after the vim-test plugin has been loaded.

I used to just have extra required files in my main init.lua file, but this feels much cleaner in my opinion.

Update: 9th February 2023 — when setting up Neovim on a fresh system, I noticed that I got a bunch of errors from the after files, as they execute on boot before I’ve actually installed the plugins. I will add protected calls to the plugins soon to mitigate these errors.
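A protected call is just a pcall wrapped around the require, so the file bails out quietly if the plugin isn’t installed yet. A sketch, using which-key purely as an example:

-- after/plugin/which-key.lua (sketch)
local ok, which_key = pcall(require, "which-key")
if not ok then
  -- Plugin not installed yet; skip its configuration.
  return
end

which_key.setup()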