Linux users have a new way to get help, without being talked down to or berated for not putting in enough effort before asking for help!

I saw this post on Mastodon, authored by @SirBoostALot, and got permission from the author to repost it here, because I think it makes some good points:

Struggling to remember all those commands? You’re not the only one – and the Linux “experts” need to understand that. Linux “experts” who look down on GUI users are missing the point – not everyone has a photographic memory.

For over a decade I have wondered why some Linux users, particularly those of the somewhat “elitist” variety, keep saying on the one hand that everyone should consider switching to Linux, but then in the next breath suggest that they need to learn how to do things from the Linux command prompt or in a terminal window. You can almost hear the contempt in their voices whenever they mention a GUI. Worse yet, some of them seem to believe that everyone over the age of 5 and under the age of 75 needs to learn how to write code.

The problem with such people is that they fall into the same logical fallacy that many of us do, which is thinking that everyone’s experience is similar to our own. Objectively we know this isn’t true (well, at least most of us do), but still there is a part of us that believes that “if I could do it, anyone can”, and worse yet, that if “it” was easy for them it will be easy for everyone else. And even if “it” wasn’t easy, everyone should still be able to do it with sufficient dedication and motivation. And if anyone isn’t as motivated as they are, well then they are just being lazy, or just want to be “spoon-fed” information.

What I believe such people don’t realize is they have one big advantage over many of the other people they share the planet with, and that is that they have good memories, maybe even near-photographic memories. When I was in school, in the pre-internet era, many of the classes involved a lot of rote memorization. I would generally pull A’s and B’s in any subject that wasn’t highly dependent on memorization, but if memorization was involved I was lucky if I could pull a D rather than failing the class completely. Some other students would ace those kinds of classes with no sweat, because they had good memories, and I imagine some of them may have gone on to become computer programmers. Nowadays they would be the kind of people who love using the command line rather than a GUI.

See, it takes a good memory to remember all those commands and all their options. In a GUI you are shown all your options, and often if you mouse over an option there is explanatory text. And some of us really need that, because again, we don’t have good memories. We simply can’t remember all those commands and options, even if we just read a man page or help screen ten minutes ago. Some people get around this by having “cheat sheets” of the commands and options they use the most, but then that limits them to a subset of all available commands (to the ones they have previously used).

When people with poor memories ask a question about how to do something in a Linux forum and the resident Linux “expert” responds in a dismissive way, implying they didn’t search hard enough or just want to be “spoon-fed” answers, that is really upsetting. Not only do the askers probably have no idea which search terms to use, but that is not how users respond to each other in Windows or MacOS forums. In those forums, if someone asks a question and you don’t know the answer or don’t want to give it, you keep your hands off the damn keyboard and move on to the next post. This is a skill many Linux users don’t seem to have developed, at least in some forums. If you feel it is beneath you to “spoon feed” an answer to someone then just move on, because once you have insulted someone they will never appreciate anything you have to say, and they will always think of you as an arrogant donkey’s patootie.

And that brings us to AI, and why it is becoming so popular. A lot of Linux “experts” seem to hate AI, or see no use for it. They correctly point out that it sometimes hallucinates answers, and if you ask it a coding question you have to check pretty much every single line of code it spits out, because that code may contain errors. AIs are actually pretty stupid sometimes; they will make logical errors that most humans with any experience would not make. But still, AIs are becoming more and more popular, and are leading to decreased usage of forums such as StackExchange, for better or worse, and in my opinion there are two big reasons why.

First, they know most or all Linux commands, and have nearly perfect memories. If you have an average to poor memory, and you ask them to write a bash script to do something, they will likely spit out code in a few seconds that would have taken you hours to research, and they will sometimes use Linux commands or statements that you had no idea even existed (and as time goes by, fewer and fewer of those are hallucinated commands that don’t actually exist). You may not have even known that there was a Linux command that will do a certain thing, but the AI does, particularly if it was trained on code.

Second, they always just deliver answers. They don’t chide the user for not having researched the question before asking, or for not reading a FAQ or a man page or some other piece of documentation that the user may not have known existed, or might not understand even if he had somehow stumbled across it. And they never complain that the user is being lazy, nor do they answer in the condescending tone that a few Linux “experts” who haunt various online forums seem to have adopted. I have never understood why some Linux “experts” complain that users want to be “spoon fed” answers, as if that’s a bad thing. Not everyone wants their operating system to be a puzzle, with new challenges to solve on an intermittent and unpredictable basis. Especially in the case of people coming from MacOS or Windows, most people want their operating system to be as trouble-free as possible.

If you have a good memory, and if you really enjoy being presented with new challenges that can only be solved after a lot of effort and frustration, then you may not grasp that most people want their operating system to be as invisible as possible and to “just work”, that most folks don’t ever want to have to use the command line, and that they certainly don’t want to “learn” an operating system. When was the last time you heard someone say that people should “learn” Windows, or “learn” MacOS? I’m not saying that never happens, but it is a pretty rare thing to hear.

What would really be helpful, and I know some old-time Linux guys will consider this anathema, is a freely-accessible AI trained specifically on Linux and related subjects (the various Linux shells and maybe even programming languages commonly used in Linux, but with preference given to subjects that matter to users). Someplace that a new Linux user could go and ask questions and just get correct answers all day long. But also one that would learn and receive correction based on user input; for example, if it offers a solution and nearly every user that tries that solution reports that it doesn’t work, the AI would start to prefer alternative solutions. In other words, don’t force the user to “learn Linux”; instead let the AI do it, and then it can assist users in their time of need, with no condescension or other snarkiness.

And for those who would complain that AI’s use too much power, I say, seriously? How much power is consumed when users search web page after web page for an answer, mostly finding irrelevant information? And isn’t it worth using a little power so people can get their issues solved and get on with their lives, rather than spending an entire day or two trying to resolve one issue (and being insulted in the process)? And maybe to actually help those of us with terrible memories, so that we are not made to feel as if we are unworthy of using Linux?

Not all Linux “experts” are of the type mentioned in that post. I have encountered many that have been willing to go out of their way to be helpful! But, unfortunately, I have also encountered the type described above, and that’s never been pleasant.

But speaking of AI, here’s what I’d like to see. Imagine having an AI assistant that can just dive into all your past notes, articles, and even those old BASIC programs from back in the day, and then use that knowledge to help you out whenever you need it. That’s like having a personal librarian, tech support, and brainstorming partner all rolled into one!

I mean, think about it – you could just ask that AI thing, “Hey, remember that old file renaming program I had back in the day? Can you help me update that for modern Linux?” and BOOM, it’s got your back. No more digging through piles of old files and trying to piece it all together. The AI just takes care of it, using all the info it’s got stored up from your past experiences.

And the security aspect is super important, too. I mean, we all want to keep our personal stuff, well, personal, you know? So having something that runs locally and doesn’t send your data out to the internet is a must. That way, you can just feed it whatever you want, and it’ll be your own little knowledge base, ready to help you out whenever you need it.

The more I think about it, the more I’m convinced this is the kind of thing that could really change the game. It’s like having a personal assistant, but one that’s tailored specifically to your own experiences and knowledge. Someone’s probably already working on something like this as I write this. At least I hope so, because it could make our lives a whole lot easier!

A handy tip for entering long and/or complicated commands at the Linux command prompt (when using the bash shell) from a Mastodon user

Saw this tip on Mastodon from user Stephan (@durchaus@mastodon.social) and thought it worth passing along:

When you are about to write a long and complicated command in bash, hit CTRL+x CTRL+e to open your default editor, in which you can compose the command. The command will be executed immediately after the file is saved and the editor is closed.

(Link to post)

I never knew you could do this. And it was only a year or two ago that I found out about CTRL+r, which lets you do a text search for commands in your history (so you don’t need to keep pressing the up arrow). Then again I am not a big command line user, but when I do need to use it, tips like these can be quite helpful IF I can remember them when I need them!
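For what it’s worth, here’s how those two shortcuts fit together at the prompt. This is just a sketch; as I understand it, bash opens whatever $VISUAL or $EDITOR points at, falling back to emacs if neither is set:

# Tell bash which editor CTRL+x CTRL+e should open (nano is just an example)
export EDITOR=nano

# CTRL+x CTRL+e : opens the editor with the current command line;
#                 save and exit, and bash runs whatever you wrote.
# CTRL+r        : incremental search backward through your history;
#                 press CTRL+r again to step to older matches.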

Thanks to Stephan for sharing this tip!

Link: A Primer for Scheduling Cron Jobs in Linux

Cron jobs in Linux are simple scheduled tasks that can be set to run commands at specific times. Unfortunately, the syntax isn’t the easiest to use or remember, but in this month’s column I’ll share some examples and tips to help you better understand and utilize cron jobs.

Full article here:
A Primer for Scheduling Cron Jobs in Linux (ServerWatch)

Note that if you have Webmin installed on a system, it may be easier to use that to schedule cron jobs.
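For a quick taste of the syntax the article covers: each line in a crontab is five time fields (minute, hour, day of month, month, day of week) followed by the command to run. The script paths below are made-up examples:

# Open your personal crontab in an editor:
crontab -e

# m h dom mon dow command
# Run a backup at 2:30 AM every day:
30 2 * * * /home/user/backup.sh
# Run a report at 8:00 AM every Monday (day-of-week 1):
0 8 * * 1 /home/user/report.sh
# Run a check every 15 minutes:
*/15 * * * * /home/user/check.sh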

Link: How to protect Apache with Fail2ban

Around 2 years ago I wrote an article about fail2ban.

Fail2ban is an intrusion prevention framework written in the Python programming language. It is able to run on POSIX systems that have an interface to a locally installed packet-control system or firewall (such as iptables or TCP Wrapper).

Fail2ban’s main function is to block selected IP addresses that may belong to hosts that are trying to breach the system’s security. It determines the hosts to be blocked by monitoring log files (e.g. /var/log/pwdfail, /var/log/auth.log, etc.) and bans any host IP that makes too many login attempts or performs any other unwanted action within a time frame defined by the administrator.

Today I want to show you some configurations that you can use to improve the security of your Apache.

Read the rest here:
How to protect Apache with Fail2ban (Linuxaria)
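To give a flavor of what those configurations look like, a fail2ban jail is a short INI-style block, typically placed in /etc/fail2ban/jail.local. This is a minimal sketch; filter names and log paths vary by distribution, so treat these values as assumptions:

# Ban a host for one hour after five failed Apache authentication attempts
[apache-auth]
enabled  = true
port     = http,https
filter   = apache-auth
logpath  = /var/log/apache2/error.log
maxretry = 5
bantime  = 3600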

Installing Linux in place of OS X on an older Mac

Former title: Installing Ubuntu Linux (or Mythbuntu) in place of OS X on a Mac Mini (changed because this article is now less specific to Ubuntu or the Mac Mini).

Important
This is an edited version of a post that originally appeared on a blog called The Michigan Telephone Blog, which was written by a friend before he decided to stop blogging. It is reposted with his permission. Comments dated before the year 2013 were originally posted to his blog.

All computers have their day in the sun, and then something new comes along that makes them less desirable. I have a Mac Mini that I acquired sometime around the time OS X Leopard came out, probably in 2009. As time went by it became slower and slower and I finally replaced it. However, it seemed a bit of a shame to have it just sitting in a closet doing nothing, so I decided to see if I could put Ubuntu Linux 12.04 (at that time the most recent LTS version) on it. I started out by trying to follow the instructions on this page (the Single Boot/MBR option) but ultimately it turned out to be not nearly that complicated. NOTE THAT THIS WILL NOT GIVE YOU A DUAL BOOT SYSTEM; THIS IS REPLACING OS X WITH UBUNTU LINUX, AND ALL YOUR EXISTING DATA ON THE HARD DRIVE WILL BE ERASED!!! YOU HAVE BEEN WARNED!!!

Also: THIS IS FOR EXPERIMENTAL AND INFORMATIONAL PURPOSES ONLY. This is what worked for ME. This MAY or MAY NOT work for you. It may even brick your system (if that should happen, try doing a PRAM reset before you totally panic). It did not do that to me, and I have never heard of anyone having that problem, but your hardware may be a bit different than mine, so I make NO guarantees!!! Here’s all I wound up having to do:

  • Connected a USB keyboard and mouse.
  • Before proceeding I made sure I had the latest firmware upgrade – that is very important since you can’t do it once you have installed Ubuntu, and the rest of this might not work if you have old firmware. Another thing you can’t do is disable the startup chime so if you don’t want that after Linux is installed, find a way to turn it off before proceeding (I wanted it, and it’s used in the instructions that follow, so I didn’t search for a way to disable it, and I don’t recommend you do either).
  • Inserted my old OS X Leopard installation CD (any OS X installation CD that works with the Mac should be fine for this procedure)
  • Rebooted while holding down the “C” key
  • Accepted the default language (English) and pressed Enter
  • From the top menu bar, I selected Utilities | Disk Utility
  • In Disk Utility I selected the internal hard drive in the left hand side panel. You have to select the drive itself (it shows the brand name of your drive), not the partition
  • Clicked the Partition tab, selected 1 Partition under the Volume Scheme (instead of Current)
  • Clicked the Options button and selected “Master Boot Record”
  • Clicked the Continue button (this erases the hard drive)
  • When the above finished, I quit Disk Utility, then quit the installer
  • The Mac rebooted — at that point I clicked and held the leftmost mouse button until the CD ejected
  • Inserted the Ubuntu Install CD, then power cycled the Mac Mini
  • At the startup chime I pressed the “C” key to boot from the CD
  • When the Ubuntu CD boots, you are given the option to install Ubuntu directly, or to try Ubuntu (which takes you to the Ubuntu desktop). As it turns out, I could have simply run the installer. Instead, I did it from the desktop because I was following some instructions that said you have to type in some things during the install, and you need to have access to a terminal window to do that. But when I tried to type those things in, Ubuntu rejected them with errors, so I just let it complete the install (answering the questions it asked during the install). I did NOT select any special partitioning, etc. – I pretty much did a plain vanilla installation, generally accepting the defaults except where they weren’t appropriate or weren’t what I wanted.
Important
NOTE: If, when you boot into the Ubuntu installation CD, you are greeted with a screen that asks you to “Select CD Rom boot type” but you cannot select anything because neither the keyboard nor the mouse are functional, or if you get a black screen at some point early in the install process, then you may need a special build of Linux intended for use with 64-bit Macs that use a 32-bit EFI. Such models were mostly manufactured in the last half of the 2000s. In that case you can try the solution offered here, if you can understand the instructions, or you can see if you can find a suitable ready-made .iso file at Matt Gadient’s site (note that even though it says “late 2006 models” in the title, those builds should work with late 2006 and later 64-bit Macs that use a 32-bit EFI).

When the install was finished (after quite some time), I rebooted. The only unusual thing was that after the reboot, the screen remains white for about 20-30 seconds right at the start of the reboot. Then after that delay, it finally decides it will load Ubuntu. That still happens and while it makes the startup take a bit longer, I’ve seen worse. The reason for this (and a possible fix) can be found here: Reducing the 30 second delay when starting 64-bit Ubuntu in BIOS mode on the old 32-bit EFI Macs.

Oh, and the sound was muted by default for some reason, and the volume slider set all the way down. After I unmuted it and turned up the volume slider, the sound worked normally.

The wired network connection worked fine. I don’t use WiFi here so I did not test that. If you have issues with those, or any other issues, see GNU/Linux Debian on a Macbook Pro 11-1, which contains some additional hints that may make things run more smoothly (or work at all) after installation. I would use any hints on that page only if you have one of the specific issues that the author had. Another page with possibly useful additional hacks to make this work is Mac book pro 11.3 Linux customization (2): grub and hardware, which may be helpful if you have issues with an unrecognized Intel Iris GPU or Thunderbolt, but again I would only use these if you really need them, since this article is from 2014 and such hacks may no longer be necessary (by the way, this blog has several other pages of rather technical hacks for using Linux on a Mac Book Pro, and while many users will not need any of them, you can search that site for the phrase “Mac book pro 11.3 Linux customization” to find the other articles).

The only real hitch I found was that it wouldn’t boot if I didn’t have a video display connected! I have no idea why that was the case but there is a hardware workaround described on this page. Hope you didn’t misplace your Mac Mini’s DVI to VGA adapter! I had a 100 Ω resistor handy and that seems to work fine (it’s probably about 40 or 50 years old but what the heck, carbon resistors don’t change value that much just sitting in a junk box). If you can’t find the DVI to VGA adapter, I have read that at least one user placed a resistor directly into the DVI port between pins C2 and C5; you can use this pinout diagram to find those pins. However, I have not tried this, so I make no guarantees. Either way, the point is that the resistor goes between the analog green signal output and the analog ground return. If you can’t find the adapter, can’t get this to work, or if you just don’t want to mess around with a resistor, you can try a DVI Emulator/Dummy Plug that includes an EDID chip, that may be available from Amazon or eBay. A DVI 1920×1080 or 1920×1200 model should be sufficient. Such a device fools the Mac Mini into thinking a display is connected.

After I had this running for a year or two I read that Ubuntu doesn’t control the system fan properly. Here is what you can do in some newer versions of Ubuntu (and possibly other Linux builds) to get fan control to work. First, try using your chosen Linux distribution’s Software Center or package manager to search for and install the macfanctld package (after you check for updates). If that doesn’t work, try this (assuming you’re using a Ubuntu or Debian based distribution):

sudo apt install -y macfanctld

On my system at least, it appeared that this did cause the fan to run a bit faster but not enough that I really noticed it. Once it is installed, you can type man macfanctld at a Linux command prompt to get configuration instructions. Changes to temperature limits and minimum fan speed are made in the file /etc/macfanctl.conf.
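For example, the file contains simple key/value lines along these lines (the values shown are illustrative assumptions, not recommendations):

# /etc/macfanctl.conf (illustrative values)
# Never let the fan spin below 2000 RPM:
fan_min: 2000
# Keep the fan at fan_min below 45 °C average, ramp to maximum above 55 °C:
temp_avg_floor: 45
temp_avg_ceiling: 55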

Also, although I have not personally done this, I have read on a couple of pages that you can enable automatic reboot after a power interruption by adding one line to /etc/rc.local (this may be Ubuntu-specific, but you can always remove it if it doesn’t work):

setpci -s 0:1f.0 0xa4.b=0
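If you try this, the resulting /etc/rc.local would look something like the following (a sketch; on newer distributions rc.local may not exist by default and may need to be created and made executable):

#!/bin/sh -e
# Tell the chipset to power the machine back on after a power failure
setpci -s 0:1f.0 0xa4.b=0
exit 0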

If you have smb or samba installed (it is installed by default in many distributions) and find that your samba/smb log files fill up with complaints such as “Unable to connect to CUPS server localhost:631 – Connection refused”, here is a workaround. In the [global] section of /etc/samba/smb.conf add these lines:

printing = bsd
printcap name = /dev/null

Then from a command prompt restart samba:

sudo service smbd restart

Long ago I read a post on Reddit where the author said, “After months of headaches and tinkering I’ve finally found the stupid way to let Linux work properly on Macs. Just add acpi_osi=Darwin as boot parameter.” Unfortunately I did not save a link to that post, so I have no idea if it is still valid advice, and therefore I would only try it if something (particularly Thunderbolt) isn’t working right and you can’t resolve it any other way. If you understand Linux kernel stuff see the final paragraphs of this page for a discussion of this setting, and also see Section 2 on this page. If you don’t use the Thunderbolt port then you probably don’t need this, and you may not need it anyway.
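If you do decide to try it, on GRUB-based systems such as Ubuntu the usual way to add a boot parameter persistently is via /etc/default/grub; a sketch:

# In /etc/default/grub, append the parameter to the default command line:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi_osi=Darwin"

# Then regenerate the GRUB configuration and reboot:
sudo update-grub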

By the way, I had intended to try Linux Mint, but they no longer offer a version that fits on a CD, and I still have about a gazillion blank CDs (plus I already had Ubuntu 12.04 burned to CD), and since the Mac Mini doesn’t have a BIOS in the traditional sense, I saw no way to install from a USB stick, so Ubuntu it was. For what I plan to do with this, having Mint would be no advantage. Many older Macs have issues booting from a USB memory stick, although one site I read said that they might be able to do so if you use UNetbootin to transfer the Linux image file to the USB stick. I did not attempt that, so I cannot say whether it works. I got it to work using a CD, but I do realize that CD and DVD burners are becoming somewhat of a scarce item these days.

Speaking of DVDs, if you have burned some Linux distro (or maybe even Windows) to a DVD and the Mac Mini refuses to read it, it may be because it’s the wrong kind of DVD. To quote from a comment in a Reddit thread entitled “[GUIDE] How to install Windows 10 on a Late 2006 iMac (And every other old Mac)”:

IMPORTANT: iMac’s internal DVD drive is very picky when booting from DVDs. Don’t use DVD-RW or DVD+RW. Don’t use DVD+R. It will only work correctly with DVD-R discs!

Most blank DVDs have an inscription on them somewhere indicating what type of DVD they are, although depending on how it was inscribed on the DVD it may be hard to read. Someone I know actually had this issue with an old Mac Mini, where it refused to read a DVD until they found some DVD-Rs and burned the new operating system to one of those.

Note that nowhere above did I mention a program called rEFIt, nor did I mention BootCamp. I didn’t need either of those.

Had this not worked I could have always reinstalled Leopard, but again, for what I was doing with this system (running a Tvheadend backend, for one thing) Linux was probably a better choice, but I still wanted to be able to use this as a backup regular desktop computer, should the need ever arise.

One final note: This is the xorg.conf I used with my “headless” system. It came from this blog post and I did have to save it as /etc/X11/xorg.conf and not the filename shown in the post (in newer versions of Linux I believe this location may have moved yet again, so you may need to search to find the correct location for the xorg configuration for your distro). Note this is only something you might consider using if you have installed a version of Linux that includes a desktop; you don’t need this if you have a server-only version of Linux that has no desktop installed (one where you do everything from a Linux command prompt). Note also that this won’t work if you are running the Wayland display server; you must be using the X window system (a.k.a. X11 or Xorg) for it to work:

Section "Monitor"
  Identifier "Monitor0"
  Modeline "1920x1080_60.00"  172.80  1920 2040 2248 2576  1080 1081 1084 1118  -HSync +Vsync
  Modeline "1024x768_60.00"  64.11  1024 1080 1184 1344  768 769 772 795  -HSync +Vsync
EndSection
Section "Screen"
  Identifier "Screen0"
  Device "VGA1"
  Monitor "Monitor0"
  DefaultDepth 24
  SubSection "Display"
    Depth 24
    Modes "1920x1080_60.00" "1024x768_60.00"
  EndSubSection
EndSection

A real help for Linux users with bad memories: Aliaser — take control of your aliases on Linux

 

Important
This is an edited version of a post that originally appeared on a blog called The Michigan Telephone Blog, which was written by a friend before he decided to stop blogging. It is reposted with his permission. Comments dated before the year 2013 were originally posted to his blog.

Here’s a program that may be useful for those of you who, like me, sometimes find ourselves at a Linux command prompt trying to recall the syntax of a command we use frequently (because, you know, it would never have occurred to the designers of Linux to actually implement commands with names that have a clear meaning in plain English):

Aliases are a great tool to help increase your productivity on the terminal with bash (or any shell program you’re using), but usually we are too lazy to think about which common or long commands we use frequently and prepare aliases for them.

And so someone has written a small piece of software to do this job: aliaser

Aliaser helps you identify frequently typed commands and creates bash aliases for them. Aliaser analyses your bash history and helps you identify commands that you use frequently.

Full article (with installation instructions) here.

One thing they forgot to mention is that once you’ve added an alias, it won’t actually be available for use until you log out and then log back in.  Also, you can delete the aliaser file and temporary directory from your /tmp directory once installation is complete.  If you ever want to uninstall aliaser, just remove the three lines added to your .bashrc file, remove the ~/.aliaser directory, and remove the /usr/bin/aliaser file.
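Since aliaser works by adding those lines to your .bashrc, I would expect that reloading that file in the current terminal has the same effect as logging out and back in, though I haven’t verified that:

source ~/.bashrc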

One way I find this useful is to make commands I can’t remember into ones that I can remember.  For example, I did this:

aliaser add processes "ps awx"

The Linux purists are probably rushing to comment that I just turned a six character command into a nine character one.  Yes, BUT, I can actually remember the word “processes”, whereas I cannot remember the options I need to use after “ps” to get the output I want. The designers of Linux seem to not realize that some of us users have really bad memories.  Another use for this is turning arcane Linux commands into the equivalent Windows commands that you’re familiar with.  You could do this:

aliaser add dir "ls -al"

So that when you type “dir”, you get a directory listing similar to what you are used to.

If you can’t even remember the aliases you’ve created (yeah, my memory really is that bad some days), just use aliaser show to see all the aliases you’ve added.

The Linux equivalent of Little Snitch, ZoneAlarm, and similar per-application firewalls?

Important
This is an edited version of a post that originally appeared on a blog called The Michigan Telephone Blog, which was written by a friend before he decided to stop blogging. It is reposted with his permission. Comments dated before the year 2013 were originally posted to his blog.

EDIT: This article is very old and outdated. For more current information, see OpenSnitch: The Little Snitch application like firewall tool for Linux.

If you are a Mac user, you’ve probably heard of Little Snitch.  It’s a commercial (as in, not free) program that lets you allow or deny connections to the Internet from individual applications.  One reason for using such a program is to detect software that has no reason to connect to the Internet nevertheless attempting to do so.  For example, you download a free screensaver (dumb move to start with) and it sends all the personal information it can find on you to some group of hackers on the other side of the world.  A program like Little Snitch would let you know that the screensaver is trying to connect to the Internet, and allow you to deny that connection.  In the Windows world, I believe that ZoneAlarm has a similar capability, and it’s also a commercial (as in, not free) program.

Leopard Flower personal firewall for Linux OS screenshot

It appears that there is a similar program for Linux users, and it IS free!  It’s called Leopard Flower and it’s described as a “Personal firewall for Linux OS (based on libnetfilter_queue) which allows to allow or deny Internet access on a per-application basis rather than on a port/protocol basis.”

Looking at the screenshot it appears to have very much the same per-application blocking functionality you’d get in one of those other programs.  I have not personally tried it yet, but I wanted to create a post about it so if someday in the future I am trying to remember the name of this program, I’ll know where to find it (yes, this blog does sort of serve as my long-term memory!).  🙂

Since this article was originally published, I have been made aware of another similar application called Douane: Linux personal firewall with per application rule controls – here are a couple of screenshots:

Douane personal firewall for GNU/Linux screenshot
Douane configurator screenshot

The only downside to this one is that as of this writing the only available package is for Arch Linux, but if you want to try to build it for a Ubuntu or Debian system, they provide a page showing the needed dependencies.

There is an older similar program called TuxGuardian, but apparently it hasn’t been updated since 2006, so I have no idea if it will even work with current versions of Linux. And as for you Android users, try the NoRoot Firewall app.

If your Linux-based PC with NVIDIA graphics started booting to a black screen or text only, here is the fix — maybe!

 

Important
This is an edited version of a post that originally appeared on a blog called The Michigan Telephone Blog, which was written by a friend before he decided to stop blogging. It is reposted with his permission. Comments dated before the year 2013 were originally posted to his blog.

I’ve seen this happen several times now on Ubuntu-Linux based systems that have NVIDIA graphics.  What happens is that “Update Manager” pops up and tells you there are updates for your software, and you accept them.  It then tells you that your system has to be rebooted.  And when you do that, you get no video, or text only.  What probably happened was that the updates you installed included an update to the Linux kernel, and the NVIDIA graphics driver currently installed on the system was compiled against the OLD kernel.

Note that this generally can only happen if you manually updated the NVIDIA graphics driver at some point. If you always installed it from the standard repositories for your distribution, you’ll probably never see this issue. So a word to the wise — when you finally get around to doing an upgrade of your Linux distribution, try to avoid manually installing the NVIDIA graphics driver. Instead, let the distribution pull it from its repository. After that, you should not have this issue in the future. By the way, if you currently are running Ubuntu, we recommend upgrading to Linux Mint rather than a newer version of Ubuntu. Linux Mint is very similar to Ubuntu, but leaves out some of the things that users seem to hate about newer releases of Ubuntu. More to the point, they are not currently talking about switching their base graphics system from the X window server system to a new display manager, which I have a feeling might cause problems for some NVIDIA graphics users.

But if you’re not yet ready to do a full reinstall of Linux, the fix for this problem is easy IF you had the foresight to set up SSH access to your Linux system BEFORE the trouble started.  If you didn’t, and you’re not a true Linux geek, you may be kind of screwed.  So if you’re reading this and your system is working fine, and you haven’t yet set up SSH access, you may want to do that.  There are several sites that tell you how to do that; here are two that I found using Google:

Basic SSH Setup On Ubuntu 10.04 Lucid Lynx Using OpenSSH Server
SSH—OpenSSH—Configuring

If you didn’t do this beforehand, you may still be able to do it if you can get to a command prompt.

Anyway, the actual fix is to (re-)install the latest NVIDIA driver for your system. It will be compiled against the new Linux kernel, and then everything should work fine. To find the correct NVIDIA driver, go to the NVIDIA Driver Downloads page, and use the dropdowns to select the correct driver for your system.  Download it to your local system, then upload it to your Linux PC (if you have SSH access working then you can use an SFTP client, such as WinSCP or Transmit, to upload your driver file).  Once you have it on your PC, from a command prompt navigate to the directory where you put the driver and then change the permissions to make it executable:

sudo chmod +x driver_upgrade_script_filename

Now try running the script (it should have a .run extension):

sudo ./driver_upgrade_script_filename

It should not complain that the Gnome Display Manager or KDE Display Manager is running (if it were, you wouldn’t be in a state of near-panic right now), but if you were just doing a regular update you’d have to run the installer while GDM/KDM is stopped. For a guide that covers that scenario, see How To Install Official Nvidia Drivers in Linux, or just know that to stop the display manager,

sudo /etc/init.d/gdm stop

should stop the Gnome Display Manager, or if you’re using KDE then the command would be

sudo /etc/init.d/kdm stop

Most sources I’ve seen suggest that you answer yes to any questions the installer may ask. The only one I’d be cautious about is letting it create a new xorg.conf if you are using a customized one (which you may well be if you’ve used any of my previous HTPC-related articles). If you have edited xorg.conf, then I’d make sure you at least have a backup before letting the installer create a new one, so you can revert back to your custom one (or compare the two and insert your customizations into the new one) if necessary.

Under Ubuntu, you may get a message similar to “Provided install script failed”. That will happen every time you update the NVIDIA driver this way and it is normal. Just ignore it and continue the installation. If you get “Error locating kernel source”, run  sudo apt-get install kernel-source  from the command prompt, then run the driver upgrade script again.

When the installer has successfully finished, reboot the system and when it comes back up, hopefully you should be happy again!

Problems you may encounter when attempting to install phpMyAdmin on your CentOS server, and how to solve them

 

Important
This is an edited version of a post that originally appeared on a blog called The Michigan Telephone Blog, which was written by a friend before he decided to stop blogging. It is reposted with his permission. Comments dated before the year 2013 were originally posted to his blog.

This article was originally published in August, 2011 and may contain outdated information.


I just spent an interesting couple of hours trying to install phpMyAdmin on an Asterisk server running CentOS 5.5. As I encountered each problem and solved it, I had to wade through a lot of pages that weren’t applicable to my installation, etc. Since many readers of this blog run similar configurations I thought I’d just list the hiccups I encountered, and what I had to do to solve them. Note that some distributions come with phpMyAdmin already installed, so make sure you don’t already have it before you try to install it!

NOTE: Think carefully about whether you really want to follow the instructions below, particularly if it requires adding a repository. If you do that, make sure you only install the software you actually need from that repository, then disable it (set enabled=0). If you don’t do that, you could easily get into a situation where some of your current software (such as PHP) simply will not upgrade no matter what you do. And if you are running a PBX “install and go” distribution, they may specifically warn you not to add repositories, or it will break your installation, so don’t do it!

If you do anything suggested below, you do it at your own risk!

• yum install phpmyadmin doesn’t work — try using the dag repository — there are several pages on the Web that tell how to do this. Use Google to search for “how to enable the dag repository” (without the quotes) if you need help. The basic idea is you need to create a file called /etc/yum.repos.d/dag.repo (with the proper permissions, ownership, etc.) and put something like this inside:

[dag]
name=Dag RPM Repository for Red Hat Enterprise Linux
baseurl=http://apt.sw.be/redhat/el$releasever/en/$basearch/dag
gpgcheck=1
enabled=1

BUT you also need to install a GPG key, and getting THAT can be a bit of a problem. Some instructions will tell you to do this:

rpm --import http://dag.wieers.com/rpm/packages/RPM-GPG-KEY.dag.txt

That link no longer works, and you have to do this instead:

rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt

But for some people even THAT doesn’t work, in which case it’s suggested you use wget to obtain the file, then import it:

wget http://apt.sw.be/RPM-GPG-KEY.dag.txt
rpm --import RPM-GPG-KEY.dag.txt

I’m being a bit non-specific because the instructions could change, and I’d prefer you find a current reference on how to enable this repository. Also, some may prefer to install RPMforge, which is a collaboration of Dag and other packagers. Regardless of the effort involved, I do suggest you install phpMyAdmin using yum, because it will install everything in the correct locations for CentOS, and you don’t have to compile it or anything like that.

Note that when you do install phpMyAdmin using yum, it may also install required dependencies such as libmcrypt and php-mcrypt (another advantage to using yum).

• You don’t have permission to access /phpmyadmin/ on this server.

Go to /etc/httpd/conf.d/phpmyadmin.conf
Under the line:
Allow from 127.0.0.1
You could add a line to allow access from your local network, for example:
Allow from 192.168.0.0/255.255.255.0
(But use values appropriate to your network).

If you are accessing the box remotely, then add a line allowing access from your IP address. Be VERY careful, because you don’t want to let the entire world into your databases!
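Putting that together, the relevant block of phpmyadmin.conf might look something like this under the Apache 2.2 syntax that CentOS 5.x uses (all of the addresses here are examples, so substitute your own):

<Directory "/usr/share/phpmyadmin">
  Order Deny,Allow
  Deny from All
  Allow from 127.0.0.1
  # Your local network:
  Allow from 192.168.0.0/255.255.255.0
  # A single remote admin address (example only):
  Allow from 203.0.113.45
</Directory>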

• Existing configuration file (./config.inc.php) is not readable.

If you’re doing this on a system running FreePBX, scroll down to where I discuss changing the ownership of all phpMyAdmin-related files and directories to be the same as the MySQL user. Otherwise, the easiest solution (though not necessarily the most secure) is to change the permissions of the file /usr/share/phpmyadmin/config.inc.php from the default of 640 to 644 (add user read permission). If no one can get to your system from outside your local network, this probably isn’t an issue, but if anyone has a better idea on this, feel free to leave a comment.

• “Error
The configuration file now needs a secret passphrase (blowfish_secret).”

Open /usr/share/phpmyadmin/config.inc.php and find this section:

/*
 * This is needed for cookie based authentication to encrypt password in
 * cookie
 */
$cfg['blowfish_secret'] = 'oh my this is such a wonderful passphrase'; /* YOU MUST FILL IN THIS FOR COOKIE AUTH! */

Insert any phrase you like (within reason) between the second pair of single quotes in the last line shown above (but don’t use ‘oh my this is such a wonderful passphrase’, I just inserted that as an example.  Be creative!).  Don’t worry, this isn’t something you’ll actually have to type in every time you want to use phpMyAdmin.

• Access denied for user ‘root’@’localhost’ (using password: YES)

You don’t log in as root; you use your MySQL username and password. In FreePBX-based systems these can be found in /etc/amportal.conf, in the AMPDBUSER and AMPDBPASS settings. BUT… if you enter a wrong user name before logging in correctly, it may have already set a cookie with that username and password, and then you won’t be able to get in even if you DO use the correct username and password. The solution is to clear all browser cookies for the address of your server, then try again — and make sure you get it right this time! 😉
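On a FreePBX-based system, a quick way to pull those two settings out of amportal.conf is:

grep -E 'AMPDBUSER|AMPDBPASS' /etc/amportal.conf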

I will note here that you can avoid some of these cookie-related issues, probably including those mentioned above, by going into /usr/share/phpmyadmin/config.inc.php and finding this section:

/* Authentication type */
$cfg['Servers'][$i]['auth_type'] = 'cookie';

If your system is behind a hardware firewall or is otherwise VERY secure, you could change the auth_type from ‘cookie’ to something else, such as ‘http’. This will save you a lot of frustration during the login process, but at the possible expense of making your database less secure.  For those concerned about security, a document on the phpMyAdmin wiki advises you to “See the page on Security or the multi-user sub-section of the FAQ for additional information, especially FAQ 4.4.”  I personally found their security documentation rather useless, because they make a lot of suggestions but provide no specific examples of how to implement those suggestions.  Anyway, I personally feel that as long as a system is behind a good firewall that doesn’t permit anyone on the “outside” to access phpMyAdmin, ‘http’ is a good compromise between a security model that might drive you crazy (‘cookie’) and one of the other models that’s fairly insecure, such as ‘config’ (which some consider insecure because it stores your server username and password in plain text).  However, if your system is otherwise VERY secure and you just don’t want to have to enter a password to use phpMyAdmin, then it is possible to change the ‘auth_type’ to ‘config’ and (in the same config file), look for these lines:

/*
 * End of servers configuration
 */

And just above those lines, insert these lines:

$cfg['Servers'][$i]['user'] = 'mysqluser';
$cfg['Servers'][$i]['password'] = 'mysqlpassword';

Change mysqluser and mysqlpassword to the correct values for your system (on a FreePBX-based system, these are the values in /etc/amportal.conf mentioned above).  I do not recommend using ‘config’ because it is less secure (be sure to read the page on Security mentioned above), but it’s up to you to decide how secure you want your system to be.

(I’m fully aware that any objections to storing the user and password values in plain text in the phpMyAdmin config.inc.php fall a bit flat when you realize the same values are stored in plain text in amportal.conf, but I also feel as though the fewer places those values are exposed, the better.  Why give potential attackers one more place to find this information?)

• phpMyAdmin – Error
Cannot start session without errors, please check errors given in your PHP and/or webserver log file and configure your PHP installation properly.

Check your /var/log/httpd/error_log – in my case, the first error message of each set contained a phrase like “open(/var/lib/php/session/sess_somerandomstring, O_RDWR) failed: Permission denied (13)” and I figured that the problem was another permissions issue.

On some sites I have found a suggestion that you change the ownership of all phpMyAdmin-related files and directories to be the same as the MySQL user (in the case of an Asterisk/FreePBX system, that would be asterisk:asterisk). On a FreePBX-based system, you could try this (check to make sure these are the correct paths before doing this):

chown asterisk:asterisk /usr/share/phpmyadmin -R
chown asterisk:asterisk /var/lib/php/session -R

If that doesn’t resolve the issue (or you’re doing this on a system that’s not running FreePBX), perhaps the easiest solution (though not necessarily the most secure) is to change the permissions of the offending file. If you have the same issue I had, try changing the permissions of the directory /var/lib/php/session from the default of 770 to 777 (add full user permissions).
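In other words (run as root, and bear in mind that 777 is the permissive quick fix, not the secure one):

chmod 777 /var/lib/php/session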

Strangely, this one didn’t show up until after I’d successfully run phpMyAdmin a few times. Go figure. Also, after fixing this, I had to delete cookies again (as mentioned in the previous item) before I could log in, but that was when I still had the ‘auth_type’ set to ‘cookie’ (another reason I decided to change that to ‘http’).

Found and solved any other “gotchas” while installing phpMyAdmin under CentOS? Think I could have solved a problem in a better way? Feel free to share your solutions in the comments.

EDIT: There is one other thing that can happen after you install or update PHP on your system (as might happen if you let a FreePBX-based distribution do an upgrade).  You may start seeing PHP warning messages such as:

PHP Warning:  PHP Startup: mcrypt: Unable to initialize module
Module compiled with module API=20050922, debug=0, thread-safety=0
PHP    compiled with module API=20060613, debug=0, thread-safety=0
These options need to match
 in Unknown on line 0

If that happens try updating the dependencies that came with phpMyAdmin, for example:

yum update libmcrypt
yum update php-mcrypt

It was the second of those two that vanquished the PHP warning messages for me.

And why did I NEED to install phpMyAdmin, you ask?  Well, because someone (ahem) made a slight configuration error and caused an endless loop that, within the space of about ten seconds or so, generated over a THOUSAND bogus records in the ‘asteriskcdr’ (Call Detail) database.  The only easy way I knew of at the time to clean them out was phpMyAdmin (since I don’t “speak” MySQL), but I don’t recommend you attempt something like that unless you know what you’re doing, because one wrong move and you could delete your entire FreePBX database (trust me, that would be a VERY bad thing!). In retrospect I probably could have used Webmin, since it also has the ability to access the MySQL database, but I didn’t think of that at the time.