[TUT] Homebrew for Linux

  • brew.sh: The missing package manager for macOS (or Linux)
    docs.brew.sh/Homebrew-on-Linux: Homebrew on Linux

    DEPRECATED, DO NOT USE!

    UGOS Pro has a broken package manager!

    Introduction

    Homebrew is a package manager for macOS (and Linux). Homebrew is minimally invasive: packages installed with brew live in a parallel world, and the existing operating system installation is left untouched. Homebrew was designed to install new versions of command-line tools such as python or rsync alongside the existing versions.

    Warning

    • All information is provided without guarantee.
    • I did this installation several months ago and therefore no longer remember the details.
    • Homebrew is only for advanced command-line junkies!

    Installation

    Preparation

    In the UGOS Pro GUI, create a new administrator account for linuxbrew (yes, that is the name of the new user).

    The invasive part

    Access the NAS via SSH with the regular administrator account. A few packages first have to be added to the existing operating system (the invasive part of the installation). sudo will ask for the original admin's password.

    Code
    sudo apt-get update
    sudo apt-get install build-essential procps curl file git

    Installing Homebrew

    To install Homebrew, we switch to the new administrator.

    Code
    sudo su - linuxbrew
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

    The first sudo expects the original admin's password. After that, when sudo asks again, the password of the new admin linuxbrew is required. Optionally, the search path can be adjusted after the installation. Please add the following line to /home/linuxbrew/.profile: eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
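
The .profile addition can also be scripted. A small sketch, run as the linuxbrew user (so $HOME is /home/linuxbrew):

```shell
# Append Homebrew's environment hook to ~/.profile of the linuxbrew user.
# The grep guard keeps repeated runs from adding duplicate lines.
line='eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"'
touch "$HOME/.profile"
grep -qxF "$line" "$HOME/.profile" || echo "$line" >> "$HOME/.profile"
```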

    Clean up, please!

    After my initial installation, Homebrew complained. The doctor provides the remedy.

    Code
    brew doctor
    brew update

    In case of a panic attack

    If anything goes wrong during the installation above, simply delete the installation directory and try again.

    Code
    # to start from scratch, delete everything
    # rm -rf /home/linuxbrew/.linuxbrew

    Hello rsync!

    Packages are installed with brew, without sudo, unless brew explicitly asks for sudo rights.

    Code
    brew install rsync

    Using rsync from Homebrew, the rsync push restriction of the system installation can be bypassed :P. The following command transferred my media library from a Synology 1517 to the new UGreen 4800 Plus.

    Code
    # rsync push auf die 4800 Plus
    rsync -avhW --no-compress --progress --rsync-path="/home/linuxbrew/.linuxbrew/bin/rsync" \
      --exclude @eaDir MediaVault/ admin@ugnas:/volume2/MediaVault

    That's all for today, folks!

    Edited once, last by ACiAtuRA: Tutorial now has status DEPRECATED. UGOS Pro is no longer a functional Debian OS. (January 14, 2026 at 7:56 PM).

  • The sudo apt-get update and sudo apt-get install produced quite a bit of output that I feel cannot just be ignored:

    ....

    Get:43 https://deb.debian.org/debian-security bookworm-security/non-free-firmware amd64 Packages [688 B]
    Get:44 https://deb.debian.org/debian-security bookworm-security/non-free-firmware Translation-en [472 B]
    Fetched 26.8 MB in 4s (7,149 kB/s)
    Reading package lists... Done
    arjen@DXP2800-5FD4:~$ sudo apt-get install build-essential procps curl file git
    Reading package lists... Done
    Building dependency tree... Done
    procps is already the newest version (2:4.0.2-3).
    file is already the newest version (1:5.44-3).
    You might want to run 'apt --fix-broken install' to correct these.
    The following packages have unmet dependencies:
    build-essential : Depends: libc6-dev but it is not going to be installed or
    libc-dev
    Depends: gcc (>= 4:10.2) but it is not going to be installed
    Depends: g++ (>= 4:10.2) but it is not going to be installed
    Depends: make
    Depends: dpkg-dev (>= 1.17.11) but it is not going to be installed
    curl : Depends: libcurl4 (= 7.88.1-10+deb12u14) but 7.88.1-10+deb12u12 is to be installed
    exiv2 : Depends: libexiv2-27 (= 0.27.6-1) but it is not going to be installed
    git : Depends: git-man (> 1:2.39.5) but it is not going to be installed
    Depends: git-man (< 1:2.39.5-.) but it is not going to be installed
    libopengl0 : Depends: libglvnd0 (= 1.7.0-2101~22.04) but 1.6.0-1 is to be installed
    E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).
    arjen@DXP2800-5FD4:~$

    If it wasn't my Ugreen DXP2800 but a workstation I would probably do what it suggests, but in this case I would like to know what to think about all those messages. The DXP2800 has been updated with the latest firmware at the end of December 2025.

    I have only made a few small changes to /etc/sshd_config in order to force key-based access and the use of ED25519 keys.
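
For reference, the kind of fragment meant here might look like this. A sketch only (option names per current OpenSSH; verify against the sshd on the NAS before restarting it):

```
# sshd_config sketch: key-only logins, ED25519 keys
PasswordAuthentication no
KbdInteractiveAuthentication no
PubkeyAuthentication yes
HostKey /etc/ssh/ssh_host_ed25519_key
PubkeyAcceptedAlgorithms ssh-ed25519
```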

    I've been looking to get backintime working with this new NAS, but there's another reason I am looking for an unrestricted rsync on the NAS.

    Rsync works over ssh from a remote server, which is how I have been doing off-site backups with my previous Netgear RN212 for ages. However, the UGOS version of rsync does not work with NAS storage on volume2 (some durable small Optane M.2 SSD's), nor with any user's home folder, so I am quite handicapped. On volume1 rsync works OK over ssh with key-based access.

    The home folder, on the other hand, is the only location where smartphone photo backups can be stored. I have been looking for ways to maneuver past the blockades that Ugreen appears to have built into UGOS. It has been suggested by another user that rsync would work with volume2 storage in daemon mode. However, that will require a VPN tunnel to keep the data safe, as daemon mode has no encryption. I do have a VPN to the network that the NAS is on, so I am not totally helpless, but I do start to feel like this is such a mire ...
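
Daemon mode would be configured roughly like this. A minimal sketch (module name, path, and user names are assumptions), to be run through the VPN since the daemon protocol itself is unencrypted:

```
# /etc/rsyncd.conf sketch on the NAS
uid = someuser
gid = users
use chroot = yes

[mediavault]
    path = /volume2/MediaVault
    read only = no
    auth users = backup
    secrets file = /etc/rsyncd.secrets
```

Started with rsync --daemon and addressed with a double-colon target, e.g. rsync -av localdir/ backup@nas::mediavault/ (whether UGOS also blocks this for /volume2 is exactly what still needs testing).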

    Thanks for any suggestions ...

    Arjen in Helsinki, Finland, and Bergen near Alkmaar, Netherlands

  • Installing Debian is not what those instructions were about ... I would lose the UGOS apps and the services they (can) provide for users on my LAN (just me and my wife). I got her to drop MS-Windows some years ago, so we have only Linux machines ...

    I'd rather not start from scratch, but follow these directions:

    How to use BackinTime with an Ugreen NAS – Digital home of George Ruinelli

    They require linuxbrew to install an alternative rsync that backintime can work with. BiT has a setting to use an alternative rsync on the server, so that seems the way to go.

    I know of many alternative systems to replace UGOS with, but I have enough to do as it is. For the time being I'll have to keep the RN212 online as well and wait until perhaps Ugreen opens up rsync with volume2 storage. It feels like an oversight that it doesn't work.

    I have not tried the rsync daemon mode yet. It feels like a left-over from times when the internet was still a relatively safe space.

    It is probably very easy to destroy UGOS on the NAS and end up with a non-functioning NAS. I like to tread very carefully.

  • It is probably very easy to destroy UGOS on the NAS and end up with a non-functioning NAS. I like to tread very carefully.

    That's why you can't just start “apt-get upgrade.” Ugreen has disabled this method of updating the system. If you need the latest tools, there's no way around Debian.


    Why don't you connect a “jump server” in between? That would also solve your problem.

    My hardware


    • DXP6800PRO | 2 x CT16G48C40S5.M8A1 16 GB 4800 MHz | 3 x Seagate ST12000VN0008-2YS101 12TB | 3 x Samsung SSD 870 EVO 1TB | 4 x Samsung SSD 990 PRO 2TB

    Edited once, last by alter Mann: A post by alter Mann was merged into this post. (January 14, 2026 at 1:30 PM).

  • I've heard of jump servers.

    I have two locations. One with a normal wideband internet connection (public IP) and just my own firewall in the router ('home'). The other is a cottage at 1500 km from home. It's behind a CGNAT.

    I run a PiHole server at 'home', and it is also the endpoint of a permanent 'reverse' SSH tunnel from the cottage location (Raspberry Pis at both ends). The DNS function (pihole & unbound) is completely separate, although it runs on the same little Raspberry Pi machine at home (with its own UPS).
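
Such a permanent reverse tunnel is usually declared along these lines; host and user names in this sketch are made up:

```
# ~/.ssh/config on the cottage Pi (hypothetical names)
Host home-tunnel
    HostName home.example.net
    User tunnel
    # expose the cottage Pi's sshd as port 2222 on the home Pi
    RemoteForward 2222 localhost:22
    ServerAliveInterval 30
    ExitOnForwardFailure yes
```

Kept alive with something like autossh -N home-tunnel or a systemd unit.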

    I think the PiHole server at the home end providing the entry point to the tunnel from/to the server at the cottage and its LAN is what is sometimes called a jump server.

    The Pihole also has Wireguard to provide safe VPN access to the 'home' LAN (also from the off-site copy server at the cottage since a few weeks).

    The server in the cottage that has been doing the rsync off-site backup copying accesses the RN212 (and now the DXP2800) without VPN or jump server. Just going out through the CGNAT with rsync and ssh to a domain name from no-ip for my home LAN. Push and pull copying ... I could replace that with the VPN connection from the cottage to home and run rsync in daemon mode or so I have been led to believe. It still needs to be shown to work (with /volume2 and/or /home folder storage).

    I have to try and think about what you mean by adding a jump server and how it would address the obstacles I experience with UGOS.

    Thanks for the suggestions anyway!

    Edited once, last by ArjenR49 (January 14, 2026 at 2:14 PM).

  • Forget the jump server, I didn't read it carefully.
    If I understand correctly, the problem is rsync's access to the home directories with the photos?
    Rsync must run with a user who has access to the directories.


  • Rsync (with ssh at least) has two problems accessing files on the DXP2800:

    1. it doesn't work with paths like /home/<some user>.

    2. it doesn't work with paths like /volume2/<some storage pool>.

    I figured that, using ssh with a command, I could run a script on the DXP2800 to copy photos from /home folders to volume2 storage. volume2 is two Optane M.2 SSDs, big enough to temporarily store photo backups from our smartphones so that they can then be copied to the off-site server. However, this is blocked by item 2. I would have to use /volume1 instead (so my 2x32 GB Optane M.2s would be more or less useless).

    BackinTime (I have it run on workstations and servers) has a problem with the UGOS rsync which can be overcome by installing a different rsync on the DXP2800 as per the instructions using linuxbrew ... I don't know linuxbrew and I had no idea that it wasn't going to work with UGOS. I have now asked the author of https://www.ruinelli.ch/how-to-use-bac…h-an-ugreen-nas for comment.

    After installing the alternative rsync one needs to move the home folder for the linuxbrew user to /volume1 by editing the passwd file. That is a useful trick to know with respect to the photos backup from smartphone, too. The backup could possibly be sent to home folders on /volume1.
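
Moving a user's home in the passwd file can be sketched like this (the file and target path in the example are assumptions; on a stock Debian one would simply run usermod -d):

```shell
# set_home rewrites the 6th (home directory) field of one user's entry
# in a passwd-format file; all other lines pass through unchanged.
set_home() {  # usage: set_home <passwd-file> <user> <new-home>
  awk -F: -v OFS=: -v u="$2" -v h="$3" '$1 == u { $6 = h } 1' "$1" > "$1.tmp" \
    && mv "$1.tmp" "$1"
}
# On the NAS, as root, this would amount to (path is an assumption):
#   set_home /etc/passwd linuxbrew /volume1/linuxbrew
```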

    I remember that on the Netgear RN212 (ReadyNAS) home folders of the users of that NAS turned out to be just as useless as seems to be the case for the DXP2800. I never used them for anything.


    P.S. rsync from a server to the Ugreen NAS works only with targets like in the next example:

    Code
    rsync -rltvih -e "ssh -p abcde -i $HOME/.ssh/id_ed25519" /media/POOL/TEST/fghi.txt rsync@<domain name>:/BiT_TEST/test/

    BiT_TEST is a shared folder on /volume1. However, /volume1 is not needed and cannot be used in the path name.

    rsync in the target description is a user name I created for the purpose.

    In the command following an ssh connection, or in an ssh terminal session, you can and have to use /volume1 in the path name. /volume2 and /home are OK as well.

    I hope I got this right. It's been a few weeks since I did all that testing.

    Edited once, last by ArjenR49: Merged a post created by ArjenR49 into this post. (January 14, 2026 at 3:01 PM).

  • Hi ArjenR49, I saw your post on my blog. It is a pity that the homebrew approach does not work anymore. I must say I have no experience with homebrew besides what I learned in this post!

    When I installed it a month ago it still worked :( Maybe there is a different, less intrusive approach to get it working?

    One idea I just had is to run ssh and rsync in a docker container. One could then mount the location where the data actually needs to go. I am just not sure if hardlinks will work.
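
Purely as a sketch of that idea (image name, port, and paths are assumptions, not a tested setup):

```yaml
# Sketch: an SSH entry point whose /backup is a bind mount of the
# otherwise restricted volume.
services:
  rsync-ssh:
    image: linuxserver/openssh-server
    ports:
      - "2222:2222"
    volumes:
      - /volume2/MediaVault:/backup
    environment:
      - PUID=1000
      - PGID=1000
```

An image like linuxserver/openssh-server listens on port 2222 and takes the client public key via its environment (not verified on UGOS).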

  • I have played around a bit and got it working.

    I created a docker container using following docker-compose file:

    Notes:

    • It only works when using port 22 directly -> would need to find a way to use it on a non-default port -> it works now (not sure why it failed before)
    • I used a mount bind, but it might also work with a normal volume mount
    • I had to install rsync manually in the container: apk add rsync -> needs to be added to a new image
    • Like on a normal system, I had to copy the ssh key beforehand: sudo ssh-copy-id -i xxx_id_rsa root@192.168.1.x -> not persisted, must be somehow added to the image
    • It creates new inodes, but the next backup re-uses them again, so that can be solved as documented in https://github.com/bit-team/backi…ment-3622378794:

      Code
      11403392 -rw-r--r--   20 1003     users         1343 Jan  9  2007 20260107-220001-751//backup/etc/wodim.conf
      11403392 -rw-r--r--   20 1003     users         1343 Jan  9  2007 20260109-220002-751//backup/etc/wodim.conf
      11403392 -rw-r--r--   20 1003     users         1343 Jan  9  2007 20260110-220001-751//backup/etc/wodim.conf
      11403392 -rw-r--r--   20 1003     users         1343 Jan  9  2007 20260111-220002-751//backup/etc/wodim.conf
      11403392 -rw-r--r--   20 1003     users         1343 Jan  9  2007 20260111-221825-751//backup/etc/wodim.conf
      11426272 -rw-r--r--    4 root     root          1343 Jan  9  2007 20260114-234855-751//backup/etc/wodim.conf
      11426272 -rw-r--r--    4 root     root          1343 Jan  9  2007 20260114-235036-751//backup/etc/wodim.conf
      11426272 -rw-r--r--    4 root     root          1343 Jan  9  2007 20260114-235052-751//backup/etc/wodim.conf
      11426272 -rw-r--r--    4 root     root          1343 Jan  9  2007 20260114-235125-751//backup/etc/wodim.conf       
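
The listing above compares the leftmost inode numbers. A tiny helper to check whether two snapshot copies are hardlinks of each other (the snapshot paths above would be its arguments):

```shell
# Two paths are hardlinks of the same file iff they share an inode
# (on the same filesystem). stat -c '%i' prints the inode number.
same_inode() {  # usage: same_inode <file1> <file2>
  [ "$(stat -c '%i' "$1")" = "$(stat -c '%i' "$2")" ]
}
```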

    I think it would be a nice solution to pack it all into a docker container. That would allow beginners to set it up without risk, too. The main challenge is how to get the SSH key into the container so it gets persisted. I tried it on a mounted volume, but then it always requested the password. I think that is due to wrong file permissions.

    Edited once, last by caco3 (January 15, 2026 at 12:27 AM).
