Thursday, 16 January 2025

Adventures with turborepo

I was working with a project that builds a Next.js website.
It was originally launched with npm run dev.
The project used devcert to generate trusted certificates for local HTTPS access.

Normally I would launch it in an Ubuntu environment running on Windows with WSL2.

This time I ran it directly from Windows. After all, Node is multi-platform.

However, I started hitting some issues:

Running devcert to install the certificates

devcert requires OpenSSL and the sh command. These are installed with Git for Windows. Ensure that "C:\Program Files\Git\usr\bin" is on your PATH.
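For a quick test you can append that folder to the path for the current session before running the install. A sketch in PowerShell (adjust for cmd):

# Append Git for Windows' Unix tools to the PATH for this session only
$env:Path += ";C:\Program Files\Git\usr\bin"

# Check that sh and openssl now resolve
Get-Command sh, openssl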

devcert errors with Node 22

devcert was erroring because it uses the deprecated createCipher method, which has been removed in Node 22. To work around this I used nvm, installed Node 21, and ran the installation code from there. This was a bit of a hack.
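Roughly, the workaround looked like this (version numbers are illustrative, and nvm-windows may want a fully-qualified version):

nvm install 21
nvm use 21
npm run certificates:install
nvm use 22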

conditional sudo code

The installation script requires sudo permissions on Linux, so package.json defined the following script:
"certificates:install": "sudo node ./src/install-certificates.js"

This wouldn't work on Windows, so a conditional script launcher was created instead:
"certificates:install": "node ./src/run-elevated.js ./src/install-certificates.js",

// This script detects the Operating System. If it is not Windows, it applies the sudo command before running the target script.
// Usage:
//    node ./run-elevated.js [script.js]
const { exec } = require("child_process");
const os = require("os");

const isWindows = os.platform() === "win32"; // Detect the OS
const targetScript = process.argv[2]; // The first argument after the script name

if (!targetScript) {
  console.error(
    "Error: No script specified. Usage: node run-elevated.js [your-script.js]",
  );
  process.exit(1);
}

// Command to execute
const command = isWindows
  ? `node ${targetScript}`
  : `sudo node ${targetScript}`;

// Execute the command
exec(command, (error, stdout, stderr) => {
  if (error) {
    console.error(`Error: ${error.message}`);
    return;
  }
  if (stderr) {
    console.error(`Stderr: ${stderr}`);
    return;
  }
  console.log(`Output: ${stdout}`);
});

stdio not accessible

When creating a certificate, devcert prompts for a password. The code above wouldn't work because the child's stdio was not connected to the launching console, so the prompt never appeared.

The code was changed to:

// This script detects the Operating System. If it is not Windows, it applies the sudo command before running the target script.
// Usage:
//    node ./run-elevated.js [script.js]
const { spawn } = require("child_process"); // Consider cross-spawn (https://www.npmjs.com/package/cross-spawn)
const os = require("os");

const isWindows = os.platform() === "win32"; // Detect the OS
const targetScript = process.argv[2]; // The first argument after the script name

if (!targetScript) {
  console.error(
    "Error: No script specified. Usage: node run-elevated.js [your-script.js]",
  );
  process.exit(1);
}

// Spawn the child process and inherit stdio to enable user interaction
const child = isWindows
  ? spawn("node", [targetScript], {
      stdio: "inherit", // Inherit stdio streams from the parent process
    })
  : spawn("sudo", ["node", targetScript], {
      stdio: "inherit", // Inherit stdio streams from the parent process
    });

// Handle errors if the child process fails
child.on("error", (error) => {
  console.error(`Error: ${error.message}`);
});

// Handle when the child process exits
child.on("exit", (code) => {
  console.log(`Child process exited with code ${code}`);
  process.exitCode = code ?? 1; // Propagate failures to the calling npm script
});


cross-spawn should still be considered as an improvement.
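A minimal sketch of that swap, assuming cross-spawn's drop-in spawn API, slotted into the script above:

// cross-spawn smooths over Windows quirks such as spaces in paths and PATHEXT resolution
const spawn = require("cross-spawn");

const child = isWindows
  ? spawn("node", [targetScript], { stdio: "inherit" })
  : spawn("sudo", ["node", targetScript], { stdio: "inherit" });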

devcert uses environment variables on Windows. turbo doesn't pass them

The final problem was that devcert tries to locate the certificates using environment variables on Windows:

function win32 (name) {
  if (process.env.LOCALAPPDATA) {
    return path.join(process.env.LOCALAPPDATA, name)
  }

  return path.join(process.env.USERPROFILE, 'Local Settings', 'Application Data', name)
}

If you launch the project with npm run dev, the environment variables are passed.

This was verified by adding the following script to package.json and running it with npm run check-env and npx turbo run check-env. The first logged the environment variables and the second didn't.

{ "scripts": { "check-env": "node -e \"console.log(process.env)\"" } }

If you launch it with turbo dev, they are undefined, and the code errors.

The solution was to add the following to turbo.json:

"globalPassThroughEnv": ["LOCALAPPDATA"],

Sunday, 5 January 2025

Building a home NAS on a Raspberry Pi: RAID versus Rsync



The Pi 5 consumes about 4 W under normal conditions. With a spinning 2.5" SATA drive it consumes about 6 W.

You can connect USB drives, but OpenMediaVault will not allow a RAID across USB drives. This is because they can be troublesome: they power down under energy-saving conditions and can be disconnected, leading to data loss.

Another option is to use a Pi Compute Module with a Compute IO board and a SATA adapter. However, currently you can only use the PCIe SATA adapter with the Compute Module 4, and its IO board only supports USB 2.

The Compute Module 5 IO board supports USB 3 but doesn't provide the PCIe interface for SATA.

Use RAID or not?

There are opinions that RAID 1 provides a false sense of security. 

Delete a file, and it is deleted on both drives.

You could use a single drive, and periodic Rsync to another connected drive.

The rsync approach also allows disks of different sizes.

You could use additive rsync, so that only new files are added. If you accidentally delete a file, it may still exist on the destination drive.

You could plug in different destination drives for taking them offsite.
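As a sketch, an additive sync is just an rsync that never deletes or overwrites (the mount points here are hypothetical):

# Copy anything new from the primary drive to the backup drive
rsync --archive --ignore-existing --human-readable --progress /mnt/primary/ /mnt/backup/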


Power consumption

This article says that an SSD will draw 0.2 A to 0.3 A. A good lower-power option is a primary SSD for always-on reads, plus a 3.5" powered, spinning SATA drive for backup.

Running at 8 W will cost approximately 6p a day (at 30p/kWh), or about £21 per year. Far cheaper than cloud solutions.
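Working that through: 8 W × 24 h = 0.192 kWh per day; 0.192 kWh × 30p ≈ 5.8p per day; 5.8p × 365 ≈ £21 per year.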


Saturday, 4 January 2025

FreeBSD mirror error: "geom: Not all disks connected"

Whilst copying data to my FreeBSD RAID1 mirror, one of the disks errored:

Jan 4 07:08:44    freenas     kernel: GEOM_MIRROR: Device RAID1: provider ad0 disconnected.
Jan 4 07:08:44    freenas     kernel: GEOM_MIRROR: Request failed (error=5). ad0[WRITE(offset=1632316556288, length=131072)]
Jan 4 07:08:44    freenas     kernel: ad0: FAILURE - WRITE_DMA48 status=51<READY,DSC,ERROR> error=10<NID_NOT_FOUND> LBA=3188118402

I also got this error:
 "geom: Not all disks connected."

ad0 had errored, leaving the RAID operating only with ad1.
The solution was posted here:

gmirror forget RAID1

gmirror status
      Name    Status  Components
mirror/RAID1  COMPLETE  ad1

gmirror remove RAID1 /dev/ad0
gmirror: No such provider: ad0.

gmirror insert RAID1 /dev/ad0

gmirror status
      Name    Status  Components
mirror/RAID1  DEGRADED  ad1  ad0 (0%)

Wednesday, 1 January 2025

Resizing a UFS partition

UFS is really only supported on FreeBSD.

Use a FreeBSD live CD.

gpart delete -i 1 /dev/da1

gpart destroy -F /dev/da1

gpart create -s GPT /dev/da1

gpart show

Note that the free space starts at a 40-sector offset (40 × 512 bytes) to allow for the GPT table.
My previous partition had started at sector 34, and creating a partition there warned "added, but partition is not aligned on 4096 bytes".


gpart show

=>        40  7814037088  da1  GPT  (3.6T)
          40  7814037088       - free -  (3.6T)

gpart add -t freebsd-ufs -b 40 -s 7814037088 /dev/da1

gpart show

=>        40  7814037088  da1  GPT  (3.6T)
          40  7814037088    1  freebsd-ufs  (3.6T)
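If the aim is to grow an existing filesystem (the recreated partition starts at the same offset the data did), the UFS filesystem can then be expanded in place; for a brand-new partition, create a fresh one instead. A sketch, assuming the partition appears as da1p1:

# Expand an existing UFS filesystem to fill the enlarged partition
growfs /dev/da1p1

# Or create a new filesystem with soft updates on an empty partition
newfs -U /dev/da1p1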



Friday, 20 December 2024

sed on FreeBSD file listing

sed can be used to modify a FreeBSD file listing with regular expressions, allowing you to convert an ls into a csv, for example.

Given a file listing like:

-rwxr-xr-x   1 Andrew.Potts  wheel   7417965 Apr 12  2023 file1.jpg
-rwxr-xr-x   1 Andrew.Potts  wheel  13536345 Apr 12  2023 file2.jpg
-rwxr-xr-x   1 Andrew.Potts  wheel  11488336 Apr 12  2023 file3.jpg

You can create and test the regular expressions on Sed Tester: https://sed.js.org/

Note, however, that FreeBSD's sed doesn't support Perl-style shorthand classes like \s, \S or \d.
Instead you must use POSIX character classes like [[:alpha:]] and [[:space:]].

ls -la | sed -E 's/^([-drwx\+]*)[[:space:]]+([0-9]+)[[:space:]]+([[:alpha:].]+)[[:space:]]+([[:alpha:]]+)[[:space:]]+([[:digit:]]+)[[:space:]]+([[:alpha:]]+[[:space:]]+[[:digit:]]+[[:space:]]+[[:digit:]]+)[[:space:]]+(.+)$/\7,\5,\6/p'
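With the listing above, this produces rows like the following (adding -n to sed stops each row being printed twice):

file1.jpg,7417965,Apr 12  2023
file2.jpg,13536345,Apr 12  2023
file3.jpg,11488336,Apr 12  2023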



Some other expressions from along the way:

A basic-regex (non -E) variant, for listings that show a time rather than a year:

sed 's/\(.* [A-Za-z]* [0-9]*\) \([0-9]*\) \([a-z|A-Z]* [0-9]* [0-9]*:[0-9]*\) \(.*\)/\4,\2,\3/'

Extracting just the owner column:

ls -la | sed -E 's/^([-drwx\+]*)[[:space:]]+([0-9]+)[[:space:]]+([[:alpha:].]+)(.+)$/\3/p'

The capture pattern written with Perl-style classes, as accepted by Sed Tester:

([-rwx]*)\s+(\d+)\s+(\S+)\s+(\S+)\s+(\d+)\s+(\w+\s+\d+\s+\d+)\s+(.+)$

And the GNU sed equivalent (GNU sed accepts \s and \S, but not \d):

sed --regexp-extended --expression='s/^([-rwx]*)\s+([0-9]+)\s+(\S+)\s+(\S+)\s+([0-9]+)\s+(\S+\s+[0-9]+\s+[0-9]+)\s+(.+)$/\5,\6,\7/'


These chained commands:

    find "/mnt/Data/MyMedia/" -type f -exec stat -f "%N@%z@%SB" -t "%Y/%m/%d %H:%M:%S" {} \; | sed -E 's|^(.*[\\/])([^\\/]+)@(.*)@(.*)|\2,\1\2,\3,\4|'

  • finds the files in the directories and subdirectories
  • retrieves the full path, size and formatted date, separated by the @ character
  • runs a regex over the result to extract the filename (\2), path and filename (\1\2), size (\3) and date (\4), comma-separating the results
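For illustration, each output line then looks something like this (hypothetical path and timestamp):

file1.jpg,/mnt/Data/MyMedia/Holidays/file1.jpg,7417965,2023/04/12 09:15:02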



Tuesday, 30 July 2024

Outlook PWA opens email links in Edge

I prefer the Outlook Progressive Web App (PWA) to the thick client.

The problem I was having was that clicking an HTTP link within an email opened Edge rather than my default browser, Chrome.

I double-checked the default App settings in Windows 11 and everything was set to Chrome.

Sadly, the instructions for setting the browser in the Outlook thick client application do not apply to the PWA. Even if you set it to "use the default browser", that setting is not respected by the PWA, nor can you change it from within the PWA.


The solution is to install the PWA as a Chrome app rather than an Edge app.

In Chrome, navigate to https://outlook.office.com/mail/.
The button to install it as a PWA is in the address bar; click on that.



Saturday, 22 June 2024

Mounting an external FAT32 USB drive with FreeNAS

Mount the FAT32 disk

dmesg | grep da1
mkdir /mnt/usbhdd
mount_msdosfs -o large /dev/da1s1 /mnt/usbhdd

rsync --recursive --ignore-existing --human-readable --progress "/mnt/usbhdd/Matthew Potts iPhone 12 Photo Backup/" "/mnt/Data/My Media/Matthew/Photo Backup/"

To copy only files matching a pattern (the include rules need a final exclude, otherwise everything is copied anyway):

rsync --recursive --ignore-existing --human-readable --progress --include='*/' --include='*.MOV' --exclude='*' "/mnt/usbhdd/Matthew Potts iPhone 12 Photo Backup/" "/mnt/Data/My Media/Matthew/Photo Backup/"

To count files:

find . -maxdepth 1 -type f -print | wc -l
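Dropping -maxdepth gives a recursive count across subdirectories:

find . -type f | wc -l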