On 03/28/2016 08:32 AM, Alan Ansell wrote:
> However, it had never really occurred to me that it might matter whether I
> install from standard user with elevated privileges (sudo) or from root
> itself. Maybe it doesn't.
Maybe it does.
It depends.
If you are installing a system package, library, utilities, anything
from the system repositories and many sanctioned user repositories then
you definitely need ROOT. You can't write to the ssytem directories and
update the RPM database otherwise.
If you are installing some 'toy' or 'at risk' item into ${HOME} then do
it with lesser privilege.
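As a sketch of what a lesser-privilege install looks like (the package
name and paths here are placeholders; `pip --user` assumes a Python
tool, and the autotools lines assume a conventional source tree):

```shell
# Install into ~/.local instead of the system directories;
# no root needed, and the RPM database is untouched.
pip install --user some-toy-package

# Or, for a classic autotools source tree:
./configure --prefix="$HOME/.local"
make && make install

# Make sure ~/.local/bin is on your PATH afterwards.
export PATH="$HOME/.local/bin:$PATH"
```

Nothing in that sequence can damage the system, which is the whole point
of doing it unprivileged.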
> Two 'major' things have occurred since. First was trying to install
> google-earth. Install failed
Probably because you weren't using root and it was a system item.
> so I got on the internet. Found a proposed
> solution that involved hacking the source code to remove two offending
> lines. The subsequent install threw up a few problems (including fonts)
Ah, that explains the modified fonts in your other post :-(
> and
> it never really ran properly, so I took it out but since it did install a
> ton of stuff in and around /usr/bin maybe that's one problem.
Maybe.
On a side note: I'm an openSUSE user, not a Red Hat user, but it's been
decades since I needed to compile anything in C that I could otherwise
get from an openSUSE repository. The sources I play with are in Perl or
Python or Ruby, and I do so in ${HOME}. I'm happily running DT from an
openSUSE repository. No need for source.
All the worthwhile Linux distributions have pretty good installers, but
they differ to a degree. There is a lot of interoperability, but then
again each needs to "add value" in its own way, so some things diverge.
In some ways Red Hat is more "leading" than SUSE.
But there should be no need to compile things that are part of the
approved distribution and supported in approved repositories unless
you've made a mess of the installation in some other way. The
installation managers and the RPM format, as you've encountered, take
care of dependencies. It's only when you start playing with rogue
repositories, or doing development while lacking the relevant "-devel"
packages, that you hit problems. (And yes, you need the "-devel"
packages for the scripting languages.)
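For instance (a sketch; these are the Fedora/dnf package names, and they
vary by distribution):

```shell
# Headers needed to build C extensions against the scripting languages.
sudo dnf install python3-devel perl-devel ruby-devel

# A missing -devel package typically surfaces as a compile error like:
#   fatal error: Python.h: No such file or directory
```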
If you are running a properly installed system from one of the
mainstream distributions, using the package manager to make sure the
dependencies are present for any package, and you have the proper
repositories configured, then you should not be hitting any of these
problems. I'd go further and say that you should not need to be
compiling darktable.
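On a Fedora system that amounts to a one-liner (assuming darktable is in
your configured repositories, as it is in stock Fedora):

```shell
# Pull darktable and its whole dependency tree from the repositories;
# dnf resolves and installs the dependencies for you.
sudo dnf install darktable

# Afterwards, confirm what was installed and from which repository.
dnf info darktable
```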
That's not to say that DT will always run; there could be problems with
your graphics driver or your desktop manager configuration, but those
are configuration and driver problems. You may need, for example, to
install an up-to-date NVIDIA driver for the card you use. Or AMD, or
Intel.
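A quick way to see which driver is actually in play (a sketch; `glxinfo`
lives in the glx-utils package on Fedora, mesa-demos on some other
distributions):

```shell
# Which kernel driver is bound to the graphics card.
lspci -k | grep -A3 -i vga

# Which OpenGL renderer the desktop is really using.
glxinfo -B | grep -i renderer
```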
> Other could be
> that just yesterday, in trying increasingly diverse means of solving my
> problem with all of this, I ran another "dnf autoremove". It found 1.8Gb of
> 538 files to remove as apparently 'orphaned' but which I suspect may have
> been partly to do with incorrect install paths.
Possibly; possibly not.
The definition of "orphaned" might simply be that they were leaves on
the dependency tree: nothing was dependent on them. That doesn't always
mean they are now redundant or superfluous.
For example, lots of things depend on the shell, or on 'rm'. But what
binary depends on DT?
Many tools need to be used with discretion.
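With dnf specifically, you can look before you leap (assuming a
reasonably recent dnf; darktable stands in here for any package you care
about):

```shell
# Preview the 'orphaned' leaves without removing anything.
dnf repoquery --unneeded

# See what, if anything, actually requires a given package.
dnf repoquery --whatrequires darktable

# Protect a leaf package you want to keep by marking it as
# user-installed; autoremove will then leave it alone.
sudo dnf mark install darktable
```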
____________________________________________________________________________
darktable user mailing list
to unsubscribe send a mail to [email protected]