[Fink-devel] ld option to suppress multiple definitions (from apple's list)

2002-04-23 Thread Chris Zubrzycki


should we maybe start to use this in our packages' LDFLAGS? It might
make finding real errors easier, especially with packages like xfree,
where I get a *lot* of these warnings. I had wondered if there was an
option like this, and now I found it :-)
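
For illustration, one way it might be wired into a package description
(a sketch only; this assumes fink's SetLDFLAGS field, analogous to the
SetCPPFLAGS field some packages already use, and that the build passes
LDFLAGS through the compiler driver):

  SetLDFLAGS: -Wl,-multiply_defined,suppress

If a package invokes ld directly rather than via cc, the flag would be
spelled -multiply_defined suppress instead.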

Begin forwarded message:
 From: Matt Watson [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: Re: ld option to suppress multiple definitions?

 On Monday, April 22, 2002, at 09:04  PM, Wade Williams wrote:

 There's a -undefined suppress option for ld, but no 
 -multiplydefined suppress option that I could find.

 You want -multiply_defined suppress.

 man ld:

-multiply_defined treatment
   Specifies how multiply defined symbols in dynamic libraries are to
   be treated when -twolevel_namespace is in effect.  treatment can
   be: error, warning, or suppress, which respectively treat multiply
   defined symbols in dynamic libraries as errors, treat them as
   warnings, or suppress the checking of multiply defined symbols
   from dynamic libraries when -twolevel_namespace is in effect.  The
   default is to treat multiply defined symbols in dynamic libraries
   as warnings when -twolevel_namespace is in effect.
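
In other words, when linking through the compiler driver you would pass
something like the following (just a sketch; cc forwards the
comma-separated form to ld):

  cc -dynamiclib -o libfoo.dylib foo.o -Wl,-multiply_defined,suppress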


-chris zubrzycki
--
PGP public key: http://homepage.mac.com/beren/publickey.txt
ID: 0xA2ABC070
Fingerprint: 26B0 BA6B A409 FA83 42B3  1688 FBF9 8232 A2AB C070


Remember: it's a Microsoft virus, not an email virus,
a Microsoft worm, not a computer worm.






[Fink-devel] Re: First SplitOff package

2002-04-23 Thread David R. Morrison

Hi Jeremy.  The line

Replaces: %n (<< 1.2.2-2)

isn't actually needed, because a package is always free to Replace earlier
versions of itself.

Also, you should probably move

Depends: gnome-libs, gtkmm (>= 1.2.8)

into the shlibs package.  The reasoning is this: a user might uninstall the
main package but keep the shlibs package, and your dependencies probably
would still be needed.  In general, BuildDepends goes in the main package,
Depends goes in the shlibs package.

Last comment:  if you could add

BuildDependsOnly: True

to the main package, that will make life easier later.  (This doesn't do
anything yet, but later, there will be checking as to whether a package
is accidentally Depending on something that it shouldn't.)
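
Putting those suggestions together, the relevant fields would end up
arranged roughly like this (just a sketch, not the final package; the
exact version operators may differ):

  Package: gnomemm
  Depends: %n-shlibs (= %v-%r)
  BuildDependsOnly: True
  ...
  SplitOff: <<
    Package: %N-shlibs
    Depends: gnome-libs, gtkmm (>= 1.2.8)
    Replaces: %N (<< 1.2.2-2)
    Files: lib/libgnomemm-1.2.9*.dylib
    DocFiles: AUTHORS COPYING ChangeLog NEWS README TODO
  >>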

  -- Dave



Update of /cvsroot/fink/shared-libraries/splitoff
In directory usw-pr-cvs1:/tmp/cvs-serv7138

Added Files:
gnomemm-1.2.2-2.info gnomemm-1.2.2-2.patch 
Log Message:
First SplitOff package... If anyone could verify that it's the correct format/files, 
that would be great!

--- NEW FILE: gnomemm-1.2.2-2.info ---
Package: gnomemm
Version: 1.2.2
Revision: 2
Maintainer: Jeremy Higgs [EMAIL PROTECTED]
Depends: gnome-libs, gtkmm (>= 1.2.8), %n-shlibs (= %v-%r)
Replaces: %n (<< 1.2.2-2)
Source: http://prdownloads.sourceforge.net/gtkmm/%n-%v.tar.gz
Patch: %f.patch
UpdateLibtoolInDirs: scripts
SetCPPFLAGS: -no-cpp-precomp
ConfigureParams: --mandir=%p/share/man
InstallScript: make install DESTDIR=%d
SplitOff: <<
  Package: %N-shlibs
  Replaces: %N (<< 1.2.2-2)
  Files: lib/libgnomemm-1.2.9*.dylib
  DocFiles: AUTHORS COPYING ChangeLog NEWS README TODO
>>
DocFiles: AUTHORS COPYING ChangeLog NEWS README TODO
Description: C++ interface for the GNOME libraries
DescDetail: <<
gnomemm is a set of C++ bindings for the GNOME libraries that provide
additional functionality above GTK+ and gtkmm.
>>
License: GPL
Homepage: http://gtkmm.sourceforge.net/







[Fink-devel] parallel downloads...

2002-04-23 Thread Chris Zubrzycki


How hard would it be to add code to perform x number of downloads at 
once, where x is set in the config field? just wondering, for people who 
have fast connections.

(would it be too hard to do for a perl beginner?)

-chris zubrzycki
--
PGP public key: http://homepage.mac.com/beren/publickey.txt
ID: 0xA2ABC070
Fingerprint: 26B0 BA6B A409 FA83 42B3  1688 FBF9 8232 A2AB C070


Twice blessed is help unlooked for. --Tolkien






Re: [Fink-devel] Is abiword still maintained ?

2002-04-23 Thread Peter O'Gorman

Well, 1.0 has been released :-)
April 19th 2002 seems to have been the release date.

http://sourceforge.net/project/showfiles.php?group_id=15518&release_id=25198

Peter

On Tuesday, April 23, 2002, at 04:12  AM, Jeff Whitaker wrote:

 On Mon, 22 Apr 2002, Jin Zhao wrote:

 Abiword is close to its 1.0 release now. Took a look at fink and found
 it not updated for a pretty long time. Is abiword still maintained as a
 Fink project? Why no updates for such a long time, i.e. are we waiting
 for a native OSX port or the abiword 1.0 release?


 Just haven't had the time and probably won't be sufficiently motivated
 to find the time until a 1.0 release.

 -Jeff

 --
 Jeffrey S. Whitaker Phone  : (303)497-6313
 Meteorologist   FAX: (303)497-6449
 NOAA/OAR/CDC  R/CDC1Email  : [EMAIL PROTECTED]
 325 BroadwayWeb: www.cdc.noaa.gov/~jsw
 Boulder, CO, USA 80303-3328 Office : Skaggs Research Cntr 1D-124









[Fink-devel] more splitoffs

2002-04-23 Thread David R. Morrison

I have made another group of splitoff packages for the shared libraries
project, deposited into the shared-libraries/splitoff module.  I will
be moving these to the unstable tree after I have added the appropriate
BuildDepends entries to other packages, to make sure nothing breaks.
It might take a while.

The new packages are:

fnlib-0.5-3.info
freetype-1.3.1-5.info
freetype-hinting-1.3.1-4.info
giflib-4.1.0-3.info
gimp-1.2.3-5.info
imagemagick-5.4.1-4.info
imagemagick-5.4.4-2.info
imagemagick-nox-5.4.4-2.info
imlib-1.9.10-6.info
libungif-4.1.0b1-3.info
netpbm-9.24-4.info

There are also a few packages where dependencies needed to be adjusted
because of these splitoffs; they are:

enlightenment-0.16.5-7.info
gdal-1.1.5-4.info
windowmaker-0.80.0-4.info

  -- Dave




Re: [Fink-devel] ld option to suppress multiple definitions (from apple's list)

2002-04-23 Thread Eric Norum


On Tuesday, April 23, 2002, at 06:21 AM, Chris Zubrzycki wrote:

 should we maybe start to use this in our packages' LDFLAGS? It might
 make finding real errors easier, especially with packages like xfree,
 where I get a *lot* of these warnings. I had wondered if there was an
 option like this, and now I found it :-)



Would it be better to use the two-level namespace support of ld instead 
of fighting against it?   I'm not trying to be confrontational, I just 
would like to know the best way of handling linker issues.  It seems to 
me that the use of -flat_namespace just raises the need for a bunch of 
other flags like -undefined suppress and -multiply_defined suppress.  
I'm concerned about all the `suppress' options hiding problems that 
really ought to be fixed properly.

--
Eric Norum [EMAIL PROTECTED]
Department of Electrical Engineering
University of Saskatchewan
Saskatoon, Canada.
Phone: (306) 966-5394   FAX:   (306) 966-5407





Re: [Fink-devel] ld option to suppress multiple definitions (from apple's list)

2002-04-23 Thread Justin Hallett

As far as I'm concerned, if you can compile it and run it, why worry
about the warnings?  I think it would be better to try to keep things as
close to what the author intended as possible.

[EMAIL PROTECTED] writes:
Would it be better to use the two-level namespace support of ld instead 
of fighting against it?   I'm not trying to be confrontational, I just 
would like to know the best way of handling linker issues.  It seems to 
me that the use of -flat_namespace just raises the need for a bunch of 
other flags like -undefined suppress and -multiply_defined suppress.  
I'm concerned about all the `suppress' options hiding problems that 
really ought to be fixed properly.

¸.·´^`·.,][JFH][`·.,¸¸.·´][JFH][¸.·´^`·.,
  Justin F. Hallett - Systems Analyst   
  Phone: (780)-408-3094
Fax: (780)-454-3200
E-Mail: [EMAIL PROTECTED]
 .·´^`·.,][JFH][`·.,¸¸.·´][JFH][¸.·´^`·.,





Re: [Fink-devel] ld option to suppress multiple definitions (from apple's list)

2002-04-23 Thread Max Horn

At 8:39 Uhr -0600 23.04.2002, Eric Norum wrote:
On Tuesday, April 23, 2002, at 06:21 AM, Chris Zubrzycki wrote:

should we maybe start to use this in our packages' LDFLAGS? It
might make finding real errors easier, especially with packages
like xfree, where I get a *lot* of these warnings. I had wondered
if there was an option like this, and now I found it :-)


Would it be better to use the two-level namespace support of ld 
instead of fighting against it?   I'm not trying to be 
confrontational, I just would like to know the best way of handling 
linker issues.  It seems to me that the use of -flat_namespace just 
raises the need for a bunch of other flags like -undefined suppress 
and -multiply_defined suppress.  I'm concerned about all the 
`suppress' options hiding problems that really ought to be fixed 
properly.

Well, if a package you work on actually works with two-level
namespaces, you are free to use it. But many, many things do not
work with two-level namespaces and probably never will (unless
somebody invests a lot of time in the specific application). For
example, applications that have plugins as loadable modules: the
modules need to access symbols from the application. If you compile
it all with two-level namespaces, this won't work, at least AFAIK.
If there is some neat, simple way to get this working, I'd be happy
to know about it, though! Never too late to learn a new
trick :-)
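
(For concreteness, the kind of link line being discussed for such a
loadable module looks roughly like this; just a sketch, the exact flags
vary per package:

  cc -bundle -flat_namespace -undefined suppress -o plugin.so plugin.o

which is exactly the combination of `suppress' options Eric is worried
about.)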


Max
-- 
---
Max Horn
Software Developer

email: mailto:[EMAIL PROTECTED]
phone: (+49) 6151-494890




Re: [Fink-devel] parallel downloads...

2002-04-23 Thread Max Horn

At 9:43 Uhr -0400 23.04.2002, Chris Zubrzycki wrote:

How hard would it be to add code to perform x number of downloads at 
once, where x is set in the config field? just wondering, for people 
who have fast connections.

First, you would have to do multiple processes (forks). Then you have
to manage those somehow.

Now what do you do if one of the downloads fails? OK, the process has
to interact with the user. Now another fails. Doh. OK, maybe you can
add a manager process which will handle first the one and then the
other. Now, what if the user aborts one of the downloads? Do the
others continue, or are they aborted as well? You also have to
differentiate between being called as part of "fink fetch-all" or
"fink fetch-missing", and the case where you are called as part of
"fink build" etc.

Then, what if two packages try to download the same files (this does
actually happen for some packages)? Of course you have to handle
that as well.

Etc. etc.

What I want to say with this is not that it's impossible, just that it's
not as trivial as you might think at first glance. I didn't even think
about this for long (I just started writing the reply with whatever came
to mind, so I am certain there are other issues I didn't even think
about). So you first would need to determine what it is you actually
want to do (i.e. find answers to my questions above). Once you have
done this (the hard part), you can think about how to
implement it.
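
(Just to make the scope concrete: the bare skeleton of the
fork-and-manage part, in fink's own Perl, might look like the sketch
below. It ignores every failure/abort/duplicate issue mentioned above,
and the $max setting merely stands in for the proposed config field.

  #!/usr/bin/perl -w
  # sketch only: one curl child per URL, at most $max running at a time
  use strict;

  my @urls = @ARGV;       # the source URLs fink has collected
  my $max  = 4;           # would come from the proposed config option
  my @running;

  foreach my $url (@urls) {
      waitpid(shift(@running), 0) if @running >= $max;   # throttle
      my $pid = fork();
      die "fork failed: $!" unless defined $pid;
      if ($pid == 0) {
          exec("curl", "-f", "-L", "-O", $url) or exit 1;
      }
      push @running, $pid;
  }
  waitpid($_, 0) for @running;   # wait for the stragglers

Everything interesting in this thread is about what has to happen around
those few lines.)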


(would it be too hard to do for a perl beginner?)

Depends on what the beginner knows. I.e. I don't view it as a language
problem so much as a design issue.


Max
-- 
---
Max Horn
Software Developer

email: mailto:[EMAIL PROTECTED]
phone: (+49) 6151-494890




Re: [Fink-devel] parallel downloads...

2002-04-23 Thread Chris Zubrzycki


I see what you mean. On a similar note, would it be difficult to make a
command, fink downloadinfo packages or fink install --downloadinfo
packages, which would go through the beginning of the install process,
calculate the missing dependencies, and output the correct
wget/curl/<insert method here> commands to stdout? Then that output
could be redirected to a text file and used
on a computer that has high bandwidth.
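
(Hypothetical usage, since no such command exists yet; the names are
just the ones proposed above:

  fink downloadinfo gimp > fetch-gimp.sh     # on the slow machine
  sh fetch-gimp.sh                           # on the fast machine

with fetch-gimp.sh containing one curl or wget line per missing source
archive.)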

On Tuesday, April 23, 2002, at 11:45 AM, Max Horn wrote:

 At 9:43 Uhr -0400 23.04.2002, Chris Zubrzycki wrote:

 How hard would it be to add code to perform x number of downloads at 
 once, where x is set in the config field? just wondering, for people 
 who have fast connections.

 First, you would have to do multiple process (forks). Then you have to 
 manage those somehow.

 Now what do you do if one of the downloads fails - ok the process has 
 to interact with the user. Now another fails. Doh. OK, maybe you can 
 add a manager process which will first handle the one then the other. 
 Now, what if the user aborts one of the downloads. Do the others 
 continue or are they aborted also? You have to differentiate between 
 being called as part of fink fetch-all or fink fetch-missing, and 
 the case where you are called as part of fink build etc.

 Then, what if two packages try to download the same files (this does 
 actually happen for some packages). So of course you have to handle 
 that as well.

 Etc. etc.

 What I want to say with this is not that it's impossible, just not that 
 trivial as you might think at the first glance. I didn't even think 
 about this long (I just started writing the reply by writing what came 
 to my head, so I am certain there are other issues left I didn't even 
 think about). So you first would need to determine what it is actually 
 you want to do (i.e. find answers for my questions above). Once you did 
 this (the hard part), you can think about how to implement it.


 (would it be too hard to do for a perl beginner?)

 Depends on what the beginner knows? I.e. I don't view it as language 
 problem, rather a design issue.


 Max
 -- ---
 Max Horn
 Software Developer

 email: mailto:[EMAIL PROTECTED]
 phone: (+49) 6151-494890



-chris zubrzycki
--
PGP public key: http://homepage.mac.com/beren/publickey.txt
ID: 0xA2ABC070
Fingerprint: 26B0 BA6B A409 FA83 42B3  1688 FBF9 8232 A2AB C070

Security Is A Series Of Well-Defined Steps...

chmod -R 0 / ; and smile :)






Re: [Fink-devel] parallel downloads...

2002-04-23 Thread Chris Devers

On Tue, 23 Apr 2002, Max Horn wrote:

 At 9:43 Uhr -0400 23.04.2002, Chris Zubrzycki wrote:
 
 How hard would it be to add code to perform x number of downloads at
 once, where x is set in the config field? just wondering, for people
 who have fast connections.

 First, you would have to do multiple process (forks). Then you have
 to manage those somehow.

Basically, re-implement a kinda backwards Apache, except instead of
serving multiple parallel URLs, you're grabbing them.

Max's points about the complexity of implementing this are all valid. I'll
just add that, in addition to the complexity/overhead/debugging this
would involve, it's also not clear that it would save much time.

Even given that the design issues are thought through & properly
implemented, I think the best case scenario (assuming that computational
time of running all this is effectively zero & we're bound instead by
bandwidth) is that it takes exactly the same amount of time to download
everything.

Think about it: instead of four consecutive downloads that take (making
up figures here) ten seconds each, you have four simultaneous downloads
that take forty seconds each, because they're still sharing the same
constrained bandwidth.

You only stand to gain if this scheme can take advantage of different
access paths (a second NIC or modem or something) or if the bottleneck is
the remote server, and not your connection. Sometimes the latter is the
case -- I think we all seem to be having a slow time getting downloads
from Sourceforge's site, for example. But in most cases I don't think
there's going to be enough gain from parallelizing to justify all the work
it'll take to get it to work reliably.

Too bad though. It's a cool idea, and I'd like to be proven wrong about my
guesses about how the download times will work  :)


--
Chris Devers[EMAIL PROTECTED]
Apache / mod_perl / http://homepage.mac.com/chdevers/resume/

More war soon. You know how it is.-- mnftiu.cc





Re: [Fink-devel] parallel downloads...

2002-04-23 Thread Chris Pepper

At 12:03 PM -0400 2002/04/23, Chris Devers wrote:
On Tue, 23 Apr 2002, Max Horn wrote:

  At 9:43 Uhr -0400 23.04.2002, Chris Zubrzycki wrote:
  
  How hard would it be to add code to perform x number of downloads at
  once, where x is set in the config field? just wondering, for people
  who have fast connections.

  First, you would have to do multiple process (forks). Then you have
  to manage those somehow.

Basically, re-implement a kinda backwards Apache, except instead of
serving multiple parallel URLs, you're grabbing them.

Max's points about the complexity of implementing this are all valid. I'll
just add that, in addition to that complexity/overhead/debugging that this
would involve, it's also not clear that it would save much time.

Even given that the design issues are thought through  properly
implemented, I think the best case scenario (assuming that computational
time of running all this is effectively zero  we're bound instead by
bandwidth) is that it takes exactly the same amount of time to download
everything.

That's not true. As you say below, the best case is if your
local pipe can handle all the downloads simultaneously, at the max speed
the far-end servers can provide them. You could get 5 quick files,
all simultaneously with getting a single slow file. I know my
download speeds are rarely the speed of my DSL line, but I can often
run 4 downloads simultaneously without visibly affecting download speed.

The ideal implementation would be to use multi-download
capabilities in curl or wget. You'd feed curl a bunch of URLs, and it
would return when they were all fetched, or failed. Unfortunately,
fink's needs are more complicated than those tools are likely to cover,
since we'd need to provide alternate sources for each target.
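
(For what it's worth, curl can already be handed several URLs in one
invocation, roughly like this; the second URL is only a placeholder, and
this alone doesn't make the transfers parallel:

  curl -f -L -O http://prdownloads.sourceforge.net/gtkmm/gnomemm-1.2.2.tar.gz \
       -O http://some.mirror.example/netpbm-9.24.tar.gz

with -O repeated once per URL.)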

A simple alternative would be for fink to start 6 children,
each of which runs a curl download. The parent would just
sleep until the children all returned, then restart at the
beginning of the download process and discover which packages, if any,
aren't present. After the first attempt, fink could run
in the current single-download try/fail/retry-alternate mode, getting
human input on which alternatives to try. It's not very elegant, but
this is deliberately a much simpler approach than building in all the
intelligence we could take advantage of, in terms of alternatives and
user interaction.

Think about it: instead of four consequitive downloads that take (making
up figures here) ten seconds each, you have four simultaneous downloads
that take forty seconds each, because they're still sharing the same
constrained bandwidth.

You only stand to gain if this scheme can take advantage of different
access paths (a second NIC or modem or something) or if the bottleneck is
the remote server, and not your connection. Sometimes the latter is the
case -- I think we all seem to be having a slow time getting downloads
from Sourceforge's site, for example. But in most cases I don't think
there's going to be enough gain from parallelizing to justify all the work
it'll take to get it to work reliably.


Chris Pepper
-- 
Chris Pepper:  http://www.reppep.com/~pepper/
Rockefeller University:   http://www.rockefeller.edu/




Re: [Fink-devel] parallel downloads

2002-04-23 Thread Dave Vasilevsky

Max Horn wrote:
 How hard would it be to add code to perform x number of downloads at
 once, where x is set in the config field? just wondering, for people
 who have fast connections.

 First, you would have to do multiple process (forks). Then you have
 to manage those somehow.

 Now what do you do if one of the downloads fails - ok the process has
 to interact with the user. Now another fails. Doh. OK, maybe you can
 add a manager process which will first handle the one then the
 other. Now, what if the user aborts one of the downloads. Do the
 others continue or are they aborted also? You have to differentiate
 between being called as part of fink fetch-all or fink
 fetch-missing, and the case where you are called as part of fink
 build etc.

 Then, what if two packages try to download the same files (this does
 actually happen for some packages). So of course you have to handle
 that as well.

I believe that apt-get (or at least dselect) has code to deal with 
parallel downloads. At least, I noticed parallel downloads happening 
when I installed Debian Linux. So if you understand C/C++, it might be a 
good idea to see how they deal with these issues. I don't know much 
perl, but I could help you figure out what apt-get is doing if you're 
not familiar with C/C++.

Toodle pip,
Dave Vasilevsky





[Fink-devel] node for automake already exists, found.

2002-04-23 Thread Chris Zubrzycki


I have seen this error and have had it myself before. I finally tracked
it down. I was going through the packages and wanted to give some a try,
but got the infamous fink error "Failed: Internal error: node for
automake already exists". I added packages a few at a time until I
narrowed it down to two: mjpegtools and xcircuit cannot be installed at
the same time, i.e. in the same install command. xcircuit has
BuildDepends: m4, automake (<< 1.5-1), and mjpegtools has
BuildDepends: automake (>= 1.5-1).

On a side note, why the double <<? I thought it was only one <. Is it
that hard to make xcircuit play nice with the latest automake?

Just thought I'd let everyone know.


-chris zubrzycki
--
PGP public key: http://homepage.mac.com/beren/publickey.txt
ID: 0xA2ABC070
Fingerprint: 26B0 BA6B A409 FA83 42B3  1688 FBF9 8232 A2AB C070


Only two things are infinite, the universe and human stupidity, and I'm
  not sure about the former.
-- 
Albert Einstein






Re: [Fink-devel] node for automake already exists, found.

2002-04-23 Thread David R. Morrison

Great!  Now maybe this can be fixed.

(By the way, the syntax is <<, <=, =, >=, >>, and it's what we got from dpkg.
What's confusing is that == is NOT part of this.)
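
For example (illustrative lines, not from any particular package):

  Depends: gtkmm (>= 1.2.8)
  BuildDepends: automake (<< 1.5-1)

mean "gtkmm at 1.2.8 or newer" and "automake strictly older than 1.5-1".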

  -- Dave




Re: [Fink-devel] node for automake already exists, found.

2002-04-23 Thread Chris Zubrzycki


I was thinking of build depends anyway. Maybe fink could handle this
itself and build the stuff that needs the older whatever first, and then
the newer. Or, for certain packages, it could build/install the older
version and then revert back to the newer version, if that is what you
had installed. For example: I have automake 1.6-2. fink could build
everything except what needs the older automake; then it would call
itself and build the older automake, install it, build/install the
package, and then put my automake 1.6-2 back. Complex, yes, but nearly
invisible to the user.

On Tuesday, April 23, 2002, at 08:01 PM, David R. Morrison wrote:

 Great!  Now maybe this can be fixed.

 (By the way, the syntax is <<, <=, =, >=, >>, and it's what we got from
 dpkg.
 What's confusing is that == is NOT part of this.)

true.

-chris zubrzycki
--
PGP public key: http://homepage.mac.com/beren/publickey.txt
ID: 0xA2ABC070
Fingerprint: 26B0 BA6B A409 FA83 42B3  1688 FBF9 8232 A2AB C070


Twice blessed is help unlooked for. --Tolkien


