All things Linux

Argent Stonecutter

Emergency Mustelid Hologram
Joined
Sep 20, 2018
Messages
5,900
Location
Coonspiracy Central, Noonkkot
SL Rez
2005
Joined SLU
Sep 2009
SLU Posts
20780
I wouldn't trust writing with a third-party NTFS implementation. Even with Microsoft's code base it was still fragile as heck a couple of dot releases into Windows 7.
 

Dakota Tebaldi

Well-known member
VVO Supporter 🍦🎈👾❤
Joined
Sep 19, 2018
Messages
8,684
Location
Gulf Coast, USA
Joined SLU
02-22-2008
SLU Posts
16791
Kernel 5.15 is way too new for most major distributions, meaning it's not in there yet.
Well, I can wait. Kubuntu 21.10 uses kernel 5.13 apparently and the previous one used 5.11, so logically the next one, which comes out in April, will have 5.15?
 

Dakota Tebaldi

Well-known member
VVO Supporter 🍦🎈👾❤
Joined
Sep 19, 2018
Messages
8,684
Location
Gulf Coast, USA
Joined SLU
02-22-2008
SLU Posts
16791
Linus and Luke try doing normal-person things on Linux:

[embedded video]
Spoiler: It actually goes pretty well for both of them.

Also, at the end Linus says all the things I've been trying to say since I started test-driving Linux. :D
 
  • 1Like
Reactions: CronoCloud Creeggan

Katheryne Helendale

🐱 Kitty Queen 🐱
Joined
Sep 20, 2018
Messages
8,754
Location
Right... Behind... You...
SL Rez
2007
Joined SLU
October 2009
SLU Posts
65534
I really don't like this desktop. I don't like that there is a top bar with nothing but the date and a system menu and a whole lot of empty space that would make a perfectly good taskbar/quickbar, but whoever makes GNOME decided that taskbars are dumb and put the quickbar functionality into a second pop-up panel called the dock which intrudes into the screen from the side or bottom and takes up a bunch of extra space. I don't like the original-Windows-8-like application launcher that takes up the whole screen. I don't like having to open another full-screen menu just to switch between open application windows.
I hear you. This is one of the reasons I use the Cinnamon desktop on Mint. If you're coming from Windows, Mint should be right up your alley. Another nice thing about Mint is that it's built from Ubuntu, so it shares Ubuntu's support structure and ease of use.
 

Katheryne Helendale

🐱 Kitty Queen 🐱
Joined
Sep 20, 2018
Messages
8,754
Location
Right... Behind... You...
SL Rez
2007
Joined SLU
October 2009
SLU Posts
65534
Heh - I have no knowledge of what things were really like at that point; I was a kid. I DO have memories of seeing boxes of Red Hat on the shelves at like Circuit City while I was looking at computer games, but I had no idea what it was. I remember guessing that it had to be something hacker-y in nature because I'd heard about "white hat" and "black hat" hackers by that point, but that was the limit of my understanding, lol.
When I was stationed in Japan in 2000, I was still pretty new to owning a Windows-based PC, having then-recently bought myself a laptop. I remember roaming the software shelves in the base exchange and seeing at least a couple of different Linux distros sold as boxed sets. Up to that point I had mainly been using an Amiga, and had only heard about Linux from a coworker of mine years prior, shortly after it was first released. Seeing the boxes on the shelves had me intrigued, but, looking back, I'm glad my first actual Linux experience came years later, in 2007, when I installed Ubuntu 7.04 and the installation and overall user experience had presumably matured greatly from what would have awaited me in those 2000 box sets. I consider myself pretty tech-savvy, but after seeing that video you shared about the guy who tried installing early Debian from a box, I'm really glad I waited to dip my toe in the Linux waters!
 
  • 1Agree
Reactions: Dakota Tebaldi

Ashiri

√(-1)
Joined
Sep 20, 2018
Messages
937
Location
RL: NZ
SL Rez
2007
SLU Posts
-1
I must have struck it lucky getting Caldera Linux (in 1999, I think), because it was a breeze to install... far nicer than Win95.
Distributions such as Debian or Slackware are for the competent or the masochistic. As for Gentoo...
 

Argent Stonecutter

Emergency Mustelid Hologram
Joined
Sep 20, 2018
Messages
5,900
Location
Coonspiracy Central, Noonkkot
SL Rez
2005
Joined SLU
Sep 2009
SLU Posts
20780
In the beginning you had to get bits and pieces from different FTP servers and put it all together yourself.
 

Knutz Scorpio

Well-known member
Joined
Sep 20, 2018
Messages
391
SL Rez
2010
Joined SLU
02-15-2014
I think it was back in the late 80s or early 90s when I saw someone with a huge stack of floppy diskettes on his desk for his attempt at installing Linux. I waited until I could buy it off the shelf on CDs for my first attempt; I think it was Slackware. I spent a fortune on various distros on CDs through the years, but would have to surrender back to MS-DOS/Windows just to get something done, typically printing.
 

Ashiri

√(-1)
Joined
Sep 20, 2018
Messages
937
Location
RL: NZ
SL Rez
2007
SLU Posts
-1
Printers are evil.

Modems too. With both printers and modems I always checked that they did not need special Windows-based drivers, so that they would work correctly with Linux.

ETA: Getting documents from the USA could be irritating if they came preformatted for Letter.
 

Veritable Quandry

Specializing in derails and train wrecks.
Joined
Sep 19, 2018
Messages
4,394
Location
Columbus, OH
SL Rez
2010
Joined SLU
20something
SLU Posts
42
I had the opposite problem in that Windows XP SP1 and SP2 did not want to work with my USB WiFi adapter, so I went with Slack until SP3 fixed the problem.
 

Dakota Tebaldi

Well-known member
VVO Supporter 🍦🎈👾❤
Joined
Sep 19, 2018
Messages
8,684
Location
Gulf Coast, USA
Joined SLU
02-22-2008
SLU Posts
16791
Arrrrggghhhh I just created a bunch of work for myself.

So if you want to install a program that isn't in your package manager (or is out of date in your package manager) - and you wouldn't know it from the way beginner-to-Linux advice channels talk, but that's actually kind of common, or at least not UNcommon - you have to ignore the advice of said noob-advice channels that urge you to never go outside the package manager, and actually go to websites to download the software yourself. If you're lucky they have a .deb file (if you're using a Debian-based Linux, etc, blah blah blah), which works functionally the same as an installer in Windows: you double-click it, your OS has a utility that knows what to do with it and will handle the install automatically, and as a bonus you might even be able to see it in your graphical package manager afterward, depending on things like the phases of the Moon.
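For what it's worth, the same .deb can also be installed from the terminal; a minimal sketch, assuming a Debian/Ubuntu system and a made-up file name:

sudo apt install ./someapp_1.0_amd64.deb   # resolves dependencies for you
# or the older two-step route:
sudo dpkg -i someapp_1.0_amd64.deb
sudo apt-get -f install                    # pulls in any missing dependencies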

But if there isn't a .deb (or whatever as appropriate) file, then you need to "compile from source". So how do you do that?

Allllllll the freaking videos and websites tell you the same thing. Even those very highly-regarded celebrity power Linuxers' videos whose comments are always filled with "wow such an amazing and useful video like always you're the best Linux channel on the whole YouTube please let me have your children" all say the exact same thing: you either pull the source files directly from git or download and extract the tar archive, and then:

./configure
make
make install

If you run into missing dependencies during the make process then you find and install those dependencies and start over again. Eventually you'll get no errors and the program will install and work.

I watched MULTIPLE videos and read MULTIPLE webpages telling you to do this. So that's what I've been doing.

Except apparently you don't want to do that. A couple of days ago, for the very first time, I saw a video that says you don't want to use this method, because when you do it this way Linux will install some of your program files into the install directory you think you made for the program but will scatter the rest all over the labyrinthine riddle that is Linux's root file directory scheme. And if you ever decide to uninstall the program and the developer didn't make a very good uninstall script (or didn't make one at all, because why would anyone uninstall something amirite), then you can delete what you think is the "program folder" and that will appear to get rid of the application, but good luck hunting down all the trash it leaves behind in your root directories. Instead, the video says something something about using "checkinstall", which... I guess turns the source code INTO a .deb file behind the scenes and then installs it, and doing it this way lets you uninstall it more completely, because your system keeps track of where everything goes when you install a .deb and can remove it all.
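For reference, the checkinstall route looks roughly like this (a sketch, assuming a Debian-based system with checkinstall installed; someapp stands for whatever package name checkinstall records):

./configure
make
sudo checkinstall      # builds a .deb behind the scenes and installs it
# later it can be removed like any other package:
sudo apt remove someapp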

So why aren't we just doing that?

Oh, it's because checkinstall isn't perfect and sometimes causes problems with the install. Great.

So then FINALLY, I find a video about compiling from source that is like, almost an hour long instead of two minutes long, and the guy explains that oh, you know you don't want to just

./configure
make
make install

and let the system do whatever it wants. There's a way to set up a folder just for a program and tell the system during the process to install all of the program files into that folder instead of all over Hell's half-acre, so that if the time comes when you want to delete that program you can be confident that all of it is gone, and you also don't have to worry about accidentally deleting shared libraries or whatnot. It's a little time-consuming, sure, but it's relatively simple and it works. And you don't need to install extra utilities like checkinstall. And then he installs several programs this way so that you can watch (including how he finds missing dependencies and all that).
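Presumably that boils down to the --prefix trick; a minimal sketch, with /opt/someapp as a made-up install folder:

./configure --prefix=/opt/someapp   # keep everything under one folder
make
sudo make install                   # files land under /opt/someapp
# uninstalling later is then just:
sudo rm -rf /opt/someapp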

So why aren't the rest of the YouTubers telling people this? Why was this information buried in like the dozenth video I found about compiling programs from source instead of present in all of them? Look, okay, I get it, you want to pretend the process is super-easy and only takes a minute because you're trying to attract noobs with promises that Linux is two-clicks simple. But it's really doing noobs a disservice because yeah, it might technically "work", but once that noob learns some more about their OS they're going to find out that there's a different way you're much, much better off doing this particular task and they're going to regret having spent a long time doing it in a "bad way" because now they've got their work cut out for them cleaning things up. So now I'm making plans to do a factory refresh of my OS because who knows how much junk is cluttering up my system directories. That's going to mean reinstalling all my stuff but unfortunately I don't see a faster way of doing the job. Besides, there's also some other things I've learned about that I really should've done at the beginning but have been left out of all the beginner videos (which is another separate rant) and I want to do those things too.
 
  • 1Like
Reactions: Noodles

Jolene Benoir

Hello World
VVO Supporter 🍦🎈👾❤
Joined
Sep 20, 2018
Messages
3,124
Location
Minnesnowta
SL Rez
2007
Joined SLU
Dec 2010
Arrrrggghhhh I just created a bunch of work for myself.
All of that is why I tend to prefer Arch and its Arch User Repository. I do like a number of other distros, Linux Mint in particular, but often they are missing apps that I wish to use, or the versions are very outdated. There's nothing wrong with that (they have their reasons, mainly stability), but I usually want to be able to install the latest version, and I dislike it when that isn't possible or the app is missing altogether.

Failing that, though, I have used other methods of installation, such as PPAs (the problem with those is that you end up with a lot of them, enough to make a plate of spaghetti), Snap (not a huge fan, but if it's the only way, I will), Flatpaks, and AppImages.
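For anyone following along, adding a PPA on an Ubuntu/Mint system looks roughly like this (ppa:someuser/someapp is a made-up example):

sudo add-apt-repository ppa:someuser/someapp
sudo apt update
sudo apt install someapp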
 

Spirits Rising

Quite Blunt
Joined
Sep 21, 2018
Messages
534
Location
Akron, OH
SL Rez
2006
Joined SLU
08/24/2014
SLU Posts
1476
All of that is why I tend to prefer Arch and its Arch User Repository. I do like a number of other distros, Linux Mint in particular, but often they are missing apps that I wish to use, or the versions are very outdated. There's nothing wrong with that (they have their reasons, mainly stability), but I usually want to be able to install the latest version, and I dislike it when that isn't possible or the app is missing altogether.

Failing that, though, I have used other methods of installation, such as PPAs (the problem with those is that you end up with a lot of them, enough to make a plate of spaghetti), Snap (not a huge fan, but if it's the only way, I will), Flatpaks, and AppImages.
It's why I like Manjaro as well - I still have access to the AUR (though at the moment not through the GUI version of Pamac, good thing I have Pacaur for the CLI).

The only problem I run into is occasionally having to edit the PKGBUILD file - I had to do that when the AUR firestorm-bin package swapped to using libxcrypt-compat instead of gconf (the glibc that Manjaro uses right now still has/uses ownership of the .so file that libxcrypt-compat handles), so I had to revert that change for my machine.
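For anyone unfamiliar with that workflow, building an AUR package by hand goes roughly like this (somepackage is a placeholder name):

git clone https://aur.archlinux.org/somepackage.git
cd somepackage
nano PKGBUILD   # edit dependencies or sources as needed
makepkg -si     # build the package and install it, pulling in dependencies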

I also had to do it to properly update discord_arch_electron a couple of days back - that package has since been updated in the AUR.

There are a few packages that are ... irritatingly not set up the way they'd normally be if you're building them yourself (I avoid building myself) - ALVR being an example. Not touching that for a bit.
 
  • 1Agree
Reactions: Jolene Benoir

Ashiri

√(-1)
Joined
Sep 20, 2018
Messages
937
Location
RL: NZ
SL Rez
2007
SLU Posts
-1
Arrrrggghhhh I just created a bunch of work for myself.
Yes, messing about with compiling applications can do that. It's been a long while since I compiled any applications in Linux, the last being Firestorm back when I had a Core2 system. I've been thinking of setting up a file server on my little PC and dual booting on the main PC.
 

Bartholomew Gallacher

Well-known member
Joined
Sep 26, 2018
Messages
5,522
SL Rez
2002
Arrrrggghhhh I just created a bunch of work for myself.

So if you want to install a program that isn't in your package manager (or is out of date in your package manager) - and you wouldn't know it from the way beginner-to-Linux advice channels talk, but that's actually kind of common, or at least not UNcommon - you have to ignore the advice of said noob-advice channels that urge you to never go outside the package manager, and actually go to websites to download the software yourself. If you're lucky they have a .deb file (if you're using a Debian-based Linux, etc, blah blah blah), which works functionally the same as an installer in Windows: you double-click it, your OS has a utility that knows what to do with it and will handle the install automatically, and as a bonus you might even be able to see it in your graphical package manager afterward, depending on things like the phases of the Moon.
The first thing to do is to check whether there's a third-party repository around that might contain the program you need. Some distributions, like Debian, have a lot of such repositories. Of course these are not endorsed by the distribution, so you are on your own there, and you have to decide whom you trust.

Next, if there is no such repository around, and if you don't mind using them: use Snap or Flatpak if your distribution supports them, and look around to see whether there's a package for the program there.
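Searching and installing that way is quick; roughly (someapp is a placeholder):

snap find someapp
sudo snap install someapp
# or with Flatpak, assuming the Flathub remote is configured:
flatpak search someapp
flatpak install flathub someapp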

Only if this does not lead to a satisfactory result should you compile the program on your own. The first challenge on some distributions: no C compiler is installed by default, so you have to install one first.

Next challenge: installing the missing development files for libraries, aka the header files.
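On a Debian/Ubuntu-based system, for example, those two steps usually come down to something like this (libsomething-dev is a placeholder):

sudo apt install build-essential    # gcc, g++, make and friends
sudo apt install libsomething-dev   # development headers for a library the build needs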

Then: some programs nowadays don't use configure/make/make install at all. CMake and Ninja, for example, are competing build systems used by some projects.
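For a CMake-based project, the rough equivalent of the configure/make dance looks like this (assuming a reasonably recent CMake, and with /opt/someapp again as a made-up prefix):

cmake -B build -DCMAKE_INSTALL_PREFIX=/opt/someapp
cmake --build build
sudo cmake --install build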

Also for bigger programs, like Chromium, having enough CPU power and RAM is an issue.

And never just run a bare ./configure - it will simply auto-configure your source package with the defaults, set it up for compilation, and that's it. No, we don't want that! First type ./configure --help and have a look at the feature flags, in case there's a feature turned off by default that you want in your binary. You can usually spot them because they start with --enable or --disable, like --enable-jpeg. Or the other way around.

Next thing: always hand ./configure a custom installation path, e.g. ./configure --prefix=/opt/yourawesomestuff

And put that path (its bin subdirectory) into your PATH environment variable. That makes deleting your own binary later much, much easier, and it also greatly reduces the chance of screwing up your distribution.
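Concretely, that means adding a line like this to your shell startup file (~/.profile or ~/.bashrc), using the example prefix from above:

export PATH="/opt/yourawesomestuff/bin:$PATH"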

Also, never just type make, but instead make -jX, where X is the number of your CPU cores multiplied by 2. This speeds up compilation a lot, because only then do all your CPU cores get something to do in parallel.
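If you don't want to count cores by hand, nproc will do it for you:

make -j"$(nproc)"             # one job per CPU core
make -j"$(( $(nproc) * 2 ))"  # or twice that, per the advice above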

And a last tip - if you're unsure about ruining your system, install it on a copy-on-write (COW) file system like ZFS. Before you start doing your own stuff, make a snapshot of your whole system. If your adventure didn't work out, roll back to the beginning.
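A sketch of what that looks like with ZFS (rpool/ROOT/myos is a made-up dataset name; adjust it to your own layout):

sudo zfs snapshot rpool/ROOT/myos@before-experiment
# ...compile and install your stuff...
sudo zfs rollback rpool/ROOT/myos@before-experiment   # undoes everything since the snapshot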
 

Argent Stonecutter

Emergency Mustelid Hologram
Joined
Sep 20, 2018
Messages
5,900
Location
Coonspiracy Central, Noonkkot
SL Rez
2005
Joined SLU
Sep 2009
SLU Posts
20780
./configure
make
make install
You have to run autoconf or autoreconf or automake (check the README.md to see which), THEN ./configure, and that will tell you about the dependencies you have to come up with.
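Roughly, for an autotools tree fresh from git (a sketch; some projects instead ship an autogen.sh that wraps this step):

autoreconf -i   # generates configure and friends from configure.ac / Makefile.am
./configure     # now reports the dependencies it can't find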

But almost always, if there's a configure file already in the directory, it was built from configure.in or configure.ac on a different system than yours.

Next we'll talk about cmake and tomcat and the other systems you have to learn about because some people don't want you to use make.