Apr 10, 2020 - Manage a blog with Gitlab



I thought it’s about time to write something about how this blog is built. All posts are written in Markdown, which is then transformed by Jekyll into plain static HTML.

The Jekyll files and posts are all in a (private) Gitlab repository. Before I write a new entry I create a new branch, then commit the new post and any changes to that branch. When I’m finished I create a ‘Merge request’ and merge it. The merge then triggers a pipeline job which builds the actual website using Jekyll. This is defined in a .gitlab-ci.yml file in the root directory of the repository:

image: ruby:2.3

variables:
  JEKYLL_ENV: production

before_script:
  - bundle install

build:
  stage: build
  script:
    - ./createTagPages.sh
    - bundle exec jekyll build -d _site
  artifacts:
    paths:
      - _site
  only:
    - master
  after_script:
    - 'curl https://floki.blog/<SOME_IDENTIFIER>/deploy/$CI_JOB_ID'

It’s probably time to update this file at some point…

Anyway, what it does is: it uses the Ruby docker image (Jekyll is written in Ruby), installs Jekyll and its dependencies (which are specified in a ‘Gemfile’, see the bottom of the page), then calls Jekyll to build the website in the _site directory. Ignore the ./createTagPages.sh for now, this only - surprise - updates the tags which you see on the left hand side of the page. When Jekyll’s done, the _site directory is used as ‘artifact’ of the build, i.e. a zip file containing the built website is created. When the build is finished I just ‘ping’ the webserver hosting the blog on a specific URL, passing on the ID of the build job.

A script on the webserver runs periodically and checks for this call in the nginx logs; if it’s found, it fetches the artifact from Gitlab and deploys the new website:


#!/bin/bash

dt=`date`
deployedId=`cat /var/www/version`

# Extract the request path of the last 'deploy' call from the nginx log;
# the job ID is the last part of that path
tmp=`cat /var/log/nginx/access.log | awk '/<SOME_IDENTIFIER>\/deploy/ {print $7}' | tail -n 1`
jobId=`basename "$tmp"`

if [[ -n "$jobId" && "$jobId" != "$deployedId" ]]
then
	cd /tmp
	curl -L --header "PRIVATE-TOKEN: <GITLAB_TOKEN>" https://gitlab.com/api/v4/projects/<PROJECT_ID>/jobs/$jobId/artifacts > artifacts.zip
	unzip artifacts.zip

	cd _site
	chown -R www-data:www-data *
	cp -r * /var/www/html/

	cd ..
	rm -rf _site artifacts.zip

	echo $jobId > /var/www/version

	echo "$dt - Deployed $jobId" >> /var/log/blog.log
fi
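The awk part relies on nginx’s default log format, where the quoted request line splits into method ($6), path ($7) and protocol ($8). A quick way to convince yourself (the log line below is made up):

```shell
# Field 7 of a default-format nginx access log line is the request path
echo '1.2.3.4 - - [10/Apr/2020:12:00:00 +0000] "GET /abc/deploy/4711 HTTP/1.1" 200 612 "-" "curl/7.58"' \
  | awk '/deploy/ {print $7}'
# prints /abc/deploy/4711
```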

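How the script is scheduled isn’t shown above; a minimal sketch using cron, assuming the script is saved as /usr/local/bin/deploy-blog.sh (a path I made up):

```shell
# root crontab entry (edit with 'crontab -e'):
# check for a new build every 5 minutes
*/5 * * * * /usr/local/bin/deploy-blog.sh
```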
Finally the Gemfile for the Jekyll installation:

source "https://rubygems.org"

gem "jekyll", "~> 3.8.3"

group :jekyll_plugins do
  gem 'jekyll-paginate'
  gem 'jekyll-sitemap'
  gem 'kramdown'
  gem 'rouge'
end

Apr 8, 2020 - /var/backups/alternatives.tar.0



A few days ago my Raspberry Pi “died”, which was pretty annoying because I just wanted to watch a movie which is on the NFS drive (served by this Raspberry Pi)… I tried to boot it up again, no success, it just hangs at some random boot state. So I investigated the SD card. Turned out that a file /var/backups/alternatives.tar.0 had filled up all the disk space! WTF!? Usually a few percent of the disk space (by default 5%) is reserved for root to prevent the system from “dying” if it runs out of space. But as the process creating this alternatives.tar.0 file was run by root, it just carried on writing to the disk until everything died, great.
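If you need to hunt down the culprit yourself first, a quick way to find what’s eating the space (a generic sketch, not specific to this incident):

```shell
# Overall usage of the mounted SD card partition
df -h /tmp/mnt
# The ten biggest directories under /var (-x stays on one filesystem)
du -xh /tmp/mnt/var | sort -h | tail -n 10
```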


# Mount second partition of the SD card 
# on another machine, e.g. /tmp/mnt

# Delete everything under /var/backups
rm -rf /tmp/mnt/var/backups/*

To prevent this from happening again:

# Delete dpkg cron job (which I suspect creates these files)
rm /tmp/mnt/etc/cron.daily/dpkg

Feb 20, 2020 - Artix Linux



I’m pretty fed up with having to reinstall my desktop PC occasionally because my distribution moved to another major release version. And I’m pretty fed up with all the crappy quirks of systemd and how it’s spreading like an infection through the Linux system. That’s why I was looking for a distribution which tries to follow the KISS principle, the old school Linux principle that a tool does one job and does it well, and which has a rolling release model. So I ended up with Artix Linux, an Arch-based distribution, which you can have with the openrc or runit init system. And I think it’s quite likely that’s gonna be my distribution of choice for many years to come.

But it will take me a while to get used to it. I started with Suse, then had Fedora for a few years, after which I used several different Debian-based distributions for many years, with a short encounter with Gentoo in between, but I hadn’t used an Arch-based distribution yet.

The installation wasn’t as smooth as with other distributions. For example I couldn’t set up an encrypted home partition in the installer. Well, I could, but then I couldn’t boot up the installed system. So I ended up performing a very basic installation via the installer and then set up the specific requirements afterwards. Which wasn’t too bad. You find a lot of good documentation on Arch.

The heart of a Linux distribution is its software management system, which in this case is ‘pacman’. And it’s pretty cool! I’ve used it now for a couple of weeks and it just works. No nasty surprises, no broken dependencies, etc. You find pretty much everything in the pacman repositories. And in the rare case you don’t, you’ll find it in the AUR, the Arch User Repository, from where you can install it from source. And in the very, very rare case you can’t even find it there, then just build it from source yourself. With the help of pacman that’s so much easier than with most other distributions.

Let’s start with the basics. I’ll probably add another post later covering the runit init system, I just have to get a bit more familiar with it myself.

Pacman basics

# System update:
pacman -Syu   # To force a full refresh of the package db: -Syyu and allow downgrade: -Syyuu

# Search for package:
pacman -Ss some_package  # search for installed packages: -Qs

# Get more info about package:
pacman -Si some_package  # or -Qi for an installed package

# List of all installed packages:
pacman -Q   # list the files of one package: -Ql some_package

# Install package:
pacman -S some_package

# better: 
pacman -Syu some_package # to make sure system is up to date, otherwise packages might get out of sync

# Install directly from file:
pacman -U some_package.tar.xz
# or from a URL:
pacman -U https://example.org/some_package.tar.xz

# Remove:
pacman -R some_package 
#  or -Rs to remove no-longer-needed deps too
#  or -Rc to also remove all packages which depend on this package
#  add n to also remove config files (e.g. -Rns)

# Remove orphaned: 
pacman -Rs $(pacman -Qdtq)

# Clear cache:
pacman -Sc  # removes cached packages that are no longer installed; -Scc clears the whole cache

Install from AUR

If you can’t find your software with pacman, you’ll most likely find it in the Arch User Repository (AUR).

Preparation: Install build tools and headers (the equivalent of ‘build-essential’ on Debian-based distros), and create a ‘builds’ directory in your home directory:

pacman -Syu base-devel linux-headers
mkdir ~/builds

Then you can build AUR packages with

cd ~/builds
git clone https://aur.archlinux.org/some_package.git
cd some_package
# build some_package.tar.xz
makepkg -s 
# or build and install directly
makepkg -si
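One thing to keep in mind: pacman -Syu does not update AUR packages. A simple manual update loop over the ~/builds directory could look like this (my own sketch, not from the Arch docs):

```shell
# Rebuild and reinstall every AUR package cloned under ~/builds
for d in ~/builds/*/; do
  cd "$d" || continue
  git pull                   # fetch the latest PKGBUILD
  makepkg -si --noconfirm    # rebuild and install the package
done
```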