Re: [Fis] No, this is not the reason.

2018-06-03 Thread Mark Johnson
Dear Krassimir and Sungchul,

I suppose this bears out Von Neumann's tongue-in-cheek advice to
Shannon! (http://www.eoht.info/page/Neumann-Shannon+anecdote)

Krassimir, just to ask about Boltzmann's use of the logs... I first
understood this to be a measure of the probability distribution of a
whole thermodynamic system which factorises into the product of
probabilities of microstates in the system. Hence the logs (and hence
Shannon's equating of "microstate" with "alphabet", which seems
reasonable at first glance)...

EXCEPT I very much like the explanation that Bob Ulanowicz gives here
(in http://www.mdpi.com/2078-2489/2/4/624) - which doesn't mention the
factorising of the probabilities of microstates, but instead argues
that -log(p(i)) gives a value for what isn't there (the "apophatic",
"absence") - and I also like Bob's criticism of Shannon for inverting
this by turning his H into a measure of surprise:

"Boltzmann described a system of rarefied, non-interacting particles
in probabilistic fashion. Probability theory quantifies the degree to
which state i is present by a measure, p(i). Conventionally, this
value is normalized to fall between zero and one by dividing the
number of times that i has occurred by the total number of
observations. Under this “frequentist” convention, the probability of
i not occurring becomes (1 − p(i)). Boltzmann’s genius, however, was
in abjuring this conventional measure of non-occurrence in favor of
the negative of the logarithm of p(i). (It should be noted that
−log(p(i)) and (1 − p(i)) vary in uniform fashion, i.e., a one-to-one
relationship between the two functions exists). His choice imposed a
strong asymmetry upon matters. Conventionally, calculating the average
nonbeing in the system using (1 − p(i)) results in the symmetrical
parabolic function (p(i) − p(i)^2). If, however, one calculates
average absence using Boltzmann’s measure, the result becomes skewed
towards smaller p(i) (or larger [1 − p(i)]), i.e., towards nonbeing."

It's such a useful equation, and I agree, "Why are the logs there?" is
an important question.
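
As a quick numerical illustration of the skew Bob describes, here is a minimal
Python sketch (my own, not taken from his paper) that tabulates the
probability-weighted versions of the two "absence" measures and shows where
each one peaks:

import math

for p in [0.05, 0.25, 0.37, 0.50, 0.75, 0.95]:
    conventional = p * (1 - p)     # contribution p(i) - p(i)^2: symmetric, peaks at p = 0.5
    boltzmann = -p * math.log(p)   # -p(i) log p(i): peaks at p = 1/e, weighting rarer states more
    print(f"p = {p:4.2f}   p(1-p) = {conventional:.3f}   -p*log(p) = {boltzmann:.3f}")

The first column is symmetric about p = 0.5; the second is skewed towards
smaller p - Bob's "bias towards nonbeing". (The logs also make the measure
additive: the log of a product of microstate probabilities becomes a sum,
which is the factorising point above.)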

Best wishes,

Mark

On 3 June 2018 at 20:22, Krassimir Markov  wrote:
> Dear Sung,
>
> You wrote:
>> I think the main reason that we express 'information'  as a logarithmic
> function of the number of choices available, n, may be because the human
> brain finds it easier to remember (and communicate and reason with)  10
> than  100, or 100 than 10. . . . 0, etc.
>>
>
> No, this is not the reason.
> The correct answer is that Shannon assumed that n=0 was possible !!!
> Because of this, to avoid dividing by zero, he used log(s).
> But this is impossible, and for many years the world has worked with log(s) not
> understanding why !
>
> log(s) is (are) not needed.
>
> It is clearer and easier to work without log(s) :=)
>
> Friendly greetings
> Krassimir
>
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis



-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Is information physical? A logical analysis

2018-05-25 Thread Mark Johnson
I would like to suggest a logical analysis of the problem based on our
intrinsic and often tacit assumptions.

   To a great extent, our ability to answer the question “Is information
physical?” depends on our model of the world. Note that here physical
means the nature of information and not its substance, or more exactly, the
substance of its carrier, which can be physical, chemical, biological or
quantum. By the way, the expression “quantum information” is only a way of
expressing that the carrier of information belongs to the quantum level of
nature. This is similar to the expressions “mixed numbers” or “decimal
numbers”, which are only forms of number representation and not numbers
themselves.

  If we assume that there is only the physical world, we first have to
answer the question “Does information exist?” All FISers assume that
information exists. Otherwise, they would not participate in our
discussions. However, some people think differently (cf., for example,
Furner, J. (2004) Information studies without information).

   Now assuming that information exists, we have only one option, namely,
to admit that information is physical because only physical things exist.
   If we assume that there are two worlds - the physical and the mental - we
have three options, assuming that information exists:
- information is physical
- information is mental
- information is both physical and mental

Finally, coming to the Existential Triad of the World, which comprises
three worlds - the physical world, the mental world and the world of
structures, we have seven options assuming that information exists:
- information is physical
- information is mental
- information is structural
- information is both physical and mental
- information is both physical and structural
- information is both structural and mental
- information is physical, structural and mental

The solution suggested by the general theory of information tries to avoid
unnecessary multiplication of essences suggesting that information (in a
general sense) exists in all three worlds but … in the physical world, it
is called *energy*, in the mental world, it is called *mental energy*, and
in the world of structures, it is called *information* (in the strict
sense). This conclusion correlates well with the suggestion of Mark Johnson
that information is both physical and not physical; the general theory
of information only makes this idea more exact and testable.
   In addition, being in the world of structures, information in the strict
sense is represented in the two other worlds by its representations and
carriers. Note that any representation of information is its carrier, but
not every carrier of information is its representation. For instance, an
envelope with a letter is a carrier of the information in this letter but it is
not its representation.
   Besides, it is possible to call all three faces of information by the
name energy - physical energy, mental energy and structural energy.

   Finally, as many interesting ideas were suggested in this discussion,
maybe Krassimir will continue his excellent initiative, combining the most
interesting contributions into a paper with the title *Is information
physical?* and publishing it in his esteemed Journal.

   Sincerely,
   Mark Burgin

On 5/11/2018 3:20 AM, Karl Javorszky wrote:

Dear Arturo,





There were some reports in clinical psychology, about 30 years ago, that
relate to the question of whether a machine can pretend to be a therapist.
That was the time when computers could first be used in an interactive
fashion, and the Rogers techniques were a recent discovery.

(Rogers developed a dialogue method where one does not address the contents
of what the patient says, but rather the emotional aspects of the message,
assumed to be at work in the patient.)



These reports said that in some cases it was indistinguishable whether a
human or a machine provided the answer to a patient's elucidations.



Progress since then has surely made it possible to create machines that are
indistinguishable from humans in interaction. Indeed, what are called "expert
systems" are widely used in many fields. If the interaction is rational,
that is, formally equivalent to a logical discussion modi Wittgenstein, the
difference in "who arrived at this answer, machinery or a human" becomes
irrelevant.



Artistry, intuition and creativity are presently seen as impossible to
translate into Wittgenstein sentences. Maybe the inner instincts are not
yet well understood. But!: there are some who are busily undermining the
current fundamentals of rational thinking. So there is hope that we shall
live to experience the ultimate disillusionment,  namely that humans are a
combinatorial tautology.



Accordingly, may I respectfully express opposing views to what you state:
that machines and humans are of incompatible builds. There are hints that
as

Re: [Fis] Is information physical? 'Signs rust.'

2018-04-26 Thread Mark Johnson
Dear Joseph,

Thank you for this beautiful summary.

That describes the world, doesn't it? (It also describes music - which is a good
sign.)

I want to say why information matters to me, not to argue about what it is. 

Information matters because it enables these conversations which dissolve 
barriers between disciplines, and ultimately has the capacity to dissolve 
barriers between each of us.

Information is such a powerful concept because everyone thinks they know what 
it is. Really, the conversation is the important thing. We may think we argue, 
but we are all in this dance together. It's always a privilege to have one's 
certainties shattered - who'd have thought the information in email messages 
could be so powerful?!

Best wishes,

Mark

-Original Message-
From: "joe.bren...@bluewin.ch" 
Sent: ‎26/‎04/‎2018 15:33
To: "u...@umces.edu" 
Cc: "fis@listas.unizar.es" 
Subject: Re: [Fis] Is information physical? 'Signs rust.'

Information refers to changes in patterns of energy flow, some slow (frozen), 
some fast, some quantitative and measurable, some qualitative and 
non-measurable, some meaningful and some meaningless, partly causally effective 
and partly inert, partly present and partly absent, all at the same time.

Best wishes,

Joseph

>Original message
>From: u...@umces.edu
>Date: 25/04/2018 - 08:14 (PDT)
>To: mbur...@math.ucla.edu
>Cc: fis@listas.unizar.es
>Subject: Re: [Fis] Is information physical?
>
>Dear Mark,
>
>I share your inclination, albeit from a different perspective.
>
>Consider the two statements:
>
>1. Information is impossible without a physical carrier.
>
>2. Information is impossible without the influence of that which does not 
>exist.
>
>There is significant truth in both statements.
>
>I know that Claude Shannon is not a popular personality on FIS, but I
>admire how he first approached the subject. He began by quantifying,
>not information in the intuitive, positivist  sense, but rather the
>*lack* of information, or "uncertainty", as he put it. Positivist
>information thereby becomes a double negative -- any decrease in
>uncertainty.
>
>In short, the quantification of information begins by quantifying
>something that does not exist, but nonetheless is related to that
>which does. Terry calls this lack the "absential", I call it the
>"apophatic" and it is a major player in living systems!
>
>Karl Popper finished his last book with the exhortation that we need
>to develop a "calculus of conditional probabilities". Well, that
>effort was already underway in information theory. Using conditional
>probabilities allows one to parse Shannon's formula for diversity into
>two terms -- one being positivist information (average mutual
>information) and the other apophasis (conditional entropy).
>
>
>This duality in nature is evident but often unnoticed in the study of
>networks. Most look at networks and immediately see the constraints
>between nodes. And so it is. But there is also indeterminacy in almost
>all real networks, and this often is disregarded. The proportions
>between constraint and indeterminacy can readily be calculated.
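
To make that parsing concrete, here is a toy numerical sketch (an editorial
illustration with a made-up joint distribution, not Bob's own calculation): for
flows from source node x to destination node y, the diversity of destinations
splits exactly into average mutual information (the constraint) plus
conditional entropy (the indeterminacy, or apophasis).

import math

def entropy(ps):
    # Shannon entropy in bits; 0 log 0 is taken as 0
    return -sum(p * math.log2(p) for p in ps if p > 0)

# made-up joint distribution p(x, y): flow from source node x to destination node y
p = {('a', 'a'): 0.30, ('a', 'b'): 0.10,
     ('b', 'a'): 0.05, ('b', 'b'): 0.55}

px = {x: sum(v for (i, _), v in p.items() if i == x) for x in 'ab'}  # source marginal
py = {y: sum(v for (_, j), v in p.items() if j == y) for y in 'ab'}  # destination marginal

H = entropy(py.values())                                # diversity of destinations, H(Y)
I = sum(v * math.log2(v / (px[x] * py[y]))              # average mutual information (constraint)
        for (x, y), v in p.items() if v > 0)
D = -sum(v * math.log2(v / px[x])                       # conditional entropy H(Y|X) (flexibility)
         for (x, y), v in p.items() if v > 0)

print(f"H(Y) = {H:.4f}   I(X;Y) + H(Y|X) = {I + D:.4f}")  # the two numbers coincide

Whatever joint distribution is plugged in, H(Y) = I(X;Y) + H(Y|X), so for a
given diversity any increase in the constraint term necessarily comes out of
the flexibility term - the trade-off described above for ecosystems.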
>
>What is important in living systems (and I usually think of the more
>indeterminate ecosystems, rather than organisms [but the point applies
>there as well]) is that some degree of conditional entropy is
>absolutely necessary for systems sustainability, as it provides the
>flexibility required to construct new responses to novel challenges.
>
>While system constraint usually abets system performance, systems that
>become too efficient do so by decreasing their (mutually exclusive)
>flexibility and become progressively vulnerable to collapse.
>
>The lesson for evolutionary theory is clear. Survival is not always a
>min/max (fitt*est*) issue. It is about a balance between adaptation
>and adaptability. Ecosystems do not attain maximum efficiency. To do
>so would doom them.
> The balance also
>puts the lie to a major maxim of economics, which is that nothing
>should hinder the efficiency of the market. That's a recipe for "boom
>and bust". 
>
>Mark, I do disagree with your opinion that information cannot be
>measured. The wider application of information theory extends beyond
>communication and covers the information inherent in structure, or
>what John Collier calls "enformation". Measurement is extremely
>important there. Perhaps you are disquieted by the relative nature of
>information measurements. Such relativity is inevitable. Information
>can only be measured with respect to some (arbitrary) reference
>distribution (which is also known in the wider realm of thermodynamics
>as "the third law".)
>
>Remember how Bateson pointed to the overwhelmingly positivist nature
>of physics. Classical physics is deficient in its 

Re: [Fis] Is information physical?

2018-04-25 Thread Mark Johnson
Dear Lou and Mark,

Thanks for this - it is very important.

A quick question: why does it have to be one or the other? Does the law of the
excluded middle apply to information? Why can't it be both?

As a way of extending this, can I suggest that the boundary between the
physical and the non-physical is between constraint A and constraint B?
It's likely that my boundary between A and B is not the same as your
boundary. My transduction process which maintains my boundary is not the
same as your transduction process. But it may well be that our transduction
processes are intertwined - like when we talk about it and try to agree
what "information" is.

As for knowledge in a textbook, the function of objects in the process of
teaching (what's that?) is certainly not as simple as the mere appearance
of textbooks would suggest: a textbook isn't a "knowledge pill". There are
related questions: what does a conductor do to an orchestra? What is the
relation of the score to what occurs? Where is the performance?

Best wishes,

Mark





On 25 April 2018 at 05:52, Louis H Kauffman  wrote:

> Dear Mark,
> Thank you for suggesting this topic.
> I concur wholeheartedly with your stand on this matter.
> Information in the sense that you indicate
> is pattern that is independent of the particular substrate on which it is
> ‘carried’.
>
> There is a persistent myth in popular scientific culture that mathematics
> and the physical are identical.
> Just as information is not physical, neither is mathematics.
> Each mathematical structure is recognizable as mathematics in that it is
> strictly relational and quite independent of the medium in which it is
> expressed.
>
> The example of mathematics as information independent of substrate
> is an opening for exploring more deeply the nature of information. For we
> are all aware
> of the remarkable interplay of mathematics and the quantitative and
> structural understanding of the physical.
>
> I suspect that the end result of that exploration will be for us to admit
> that
> we do not know what is physical,
> that we can deny that information is not physical.
>
> The crux of the matter (sic)
> lies in the distinction made between the physical and the non-physical.
> There is such a distinction.
> The boundary of that distinction is unknown territory.
> Very best,
> Lou Kauffman
>
>
> On Apr 24, 2018, at 8:47 PM, Burgin, Mark  wrote:
>
> Dear Colleagues,
>
> I would like to suggest the new topic for discussion
>
>   Is information physical?
>
> My opinion is presented below:
>
>
>Why some people erroneously think that information is physical
>
>The main reason to think that information is physical is the strong
> belief of many people, especially, scientists that there is only physical
> reality, which is studied by science. At the same time, people encounter
> something that they call information.
>When people receive a letter, they comprehend that it is information
> because with the letter they receive information. The letter is physical,
> i.e., a physical object. As a result, people start thinking that
> information is physical. When people receive an e-mail, they comprehend
> that it is information because with the e-mail they receive information.
> The e-mail comes to the computer in the form of electromagnetic waves,
> which are physical. As a result, people start thinking even more that
> information is physical.
>However, letters, electromagnetic waves and actually all physical
> objects are only carriers or containers of information.
>To understand this better, let us consider a textbook. Is it possible to
> say that this book is knowledge? Any reasonable person will say that the
> textbook contains knowledge but is not knowledge itself. In the same way,
> the textbook contains information but is not information itself. The same
> is true for letters, e-mails, electromagnetic waves and other physical
> objects because all of them only contain information but are not
> information. For instance, as we know, different letters can contain the
> same information. Even if we make an identical copy of a letter or any
> other text, then the letter and its copy will be different physical objects
> (physical things) but they will contain the same information.
>Information belongs to a different (non-physical) world of knowledge,
> data and similar essences. In spite of this, information can act on
> physical objects (physical bodies) and this action also misleads people who
> think that information is physical.
>One more misleading property of information is that people can measure
> it. This is coupled with the erroneous assumption that it is possible to measure only
> physical essences. Naturally, this brings people to the erroneous
> conclusion that information is physical. However, measuring information is
> essentially different than measuring physical quantities, i.e., 

Re: [Fis] Is Dataism the end of classical hypothesis-driven research and the beginning of data-correlation-driven research?

2018-03-13 Thread Mark Johnson
Hi Alex,

Yes I agree about intuition. We should understand it better, and
that's best done with some kind of practice with it. For you it's
yoga; for me it's music. I suspect they're very closely related.  I
think it's dangerous, however, to disregard technology. Music is
highly technological: some of the earliest technologies we possess in
our museums are musical instruments. But they work to "tune the
intuition". This is what we need our computers to be (the word
"computer" is "com-putare" - "putare" is "to contemplate"). I suspect
the problem is the logic of the digital computer... music (and yoga?)
itself seems to work on the basis of some kind of analogue computation
(contemplation!).

Stafford Beer wrote this account of Ross Ashby's sudden decision to
join Von Foerster at the University of Illinois. I found the
manuscript in Beer's archive in Liverpool. Beer was fascinated by
Ashby's decision. Ashby (who was quite a cold fish, by all accounts)
expressed his rationale for his decision...

"Late in 1960, a group of Heinz von Foerster’s friends were together
in the evening at Heinz’s home in Urbana, Illinois. A complicated
ballet ensued, the choreography of which I do not altogether remember.
At the precisely proper – the balletic -  moment, Heinz offered Ross a
Chair in BCL. He quietly accepted, without a moment’s pause, and asked
to telephone his wife back home in Bristol. It was the middle of the
night: thank goodness that the sun moves from East to West. Everyone
concerned was totally astonished – Mrs Ashby, I think I may say,
especially. And so he changed his life: for vitally important years
1961–1970, W Ross Ashby MD was Professor in the Department of
Biophysics and Electrical Engineering at the University of Illinois in
Urbana. It couldn’t have happened to a nicer psychiatrist.

We walked back alone together to the Faculty Club, where we had
adjacent rooms, across the campus under a full moon. We were strolling
quietly and relaxed. I told him that I was amazed at his instant
decisiveness. He asked me why. I talked about his scientific acumen,
his meticulous methodology, his exactitude: I had expected him to ask
for a year to consider, to evaluate the evidence for and against
emigration. Surely his response had been atypically irrational?

He stopped in his tracks and turned to me, and I shall never forget
his TEACHING me at that moment. No, he said calmly. Years of research
could not attain to certainty in a decision of this kind: the variety
of the options had been far too high. The most rational response would
be to notice that the brain is a self-organizing computer which might
be able to assimilate the variety, and deliver an output in the form
of a hunch. He had felt this hunch. He had rationally obeyed it. And
had there been no hunch, no sense of an heuristic process to pursue?
Ross shrugged: ‘then the most rational procedure would be to toss a
coin’."


Best wishes,

Mark

On 13 March 2018 at 07:38, Alex Hankey <alexhan...@gmail.com> wrote:
> Dear Mark and Alberto,
>
> Let me propose a radical new input.
> The Human intuition is far more
> powerful than anything anyone
> has previously imagined, except
> those who use it regularly.
>
> It can be strengthened by particular
> mental practices, well described
> in the literature of Yoga.
>
> Digital Computing machines are
> not capable of this, and although
> number crunching is a way for
> Technology to assist, it is no substitute
> for the highest levels of the human mind.
>
> Alex
>
>
> On 13 March 2018 at 01:10, Mark Johnson <johnsonm...@gmail.com> wrote:
>>
>> Dear Alberto,
>>
>> Thank you for this topic – it cuts to the heart of why we think the
>> study of information really matters, and most importantly, brings to
>> the fore the thorny issue of technology.
>>
>> It has become commonplace to say that our digital computers have
>> changed the world profoundly. Yet at a deep level it has left us very
>> confused and disorientated, and we struggle to articulate exactly how
>> the world has been transformed. Norbert Wiener once remarked in the
>> wake of cybernetics, “We have changed the world. Now we have to change
>> ourselves to survive in it”. Things haven’t got any easier in the
>> intervening decades; quite the reverse.
>>
>> The principal manifestation of the effects of technology is confusion
>> and ambiguity. In this context, it seems that the main human challenge
>> to which the topic of information has the greatest bearing is not
>> “information” per se, but decision. That, in a large part, depends on
>> hypothesis and the judgement of the human intellect.
>>
>> The reaction to confusion and ambiguity is that some people and most
>> institutio

Re: [Fis] Is Dataism the end of classical hypothesis-driven research and the beginning of data-correlation-driven research?

2018-03-12 Thread Mark Johnson
Dear Alberto,

Thank you for this topic – it cuts to the heart of why we think the
study of information really matters, and most importantly, brings to
the fore the thorny issue of technology.

It has become commonplace to say that our digital computers have
changed the world profoundly. Yet at a deep level it has left us very
confused and disorientated, and we struggle to articulate exactly how
the world has been transformed. Norbert Wiener once remarked in the
wake of cybernetics, “We have changed the world. Now we have to change
ourselves to survive in it”. Things haven’t got any easier in the
intervening decades; quite the reverse.

The principal manifestation of the effects of technology is confusion
and ambiguity. In this context, it seems that the main human challenge
to which the topic of information has the greatest bearing is not
“information” per se, but decision. That, in a large part, depends on
hypothesis and the judgement of the human intellect.

The reaction to confusion and ambiguity is that some people and most
institutions acquire misplaced confidence in making decisions about
“the way forwards”, usually invoking some new tool or device as a
solution to the problem of dealing with ambiguity (right now, it’s
blockchain and big data). We - and particularly our institutions -
remain allergic to uncertainty. To what extent is “data-ism” a
reaction to the confusion produced by technology? Von Foerster sounded
the alarm in the 1970s:

“we have, hopefully only temporarily, relinquished our responsibility
to ask for a technology that will solve existent problems. Instead we
have allowed existent technology to create problems it can solve.” (in
Von Foerster, H (1981) "Observing Systems")

With every technical advance, there is an institutional reaction. The
Catholic church reacted to printing; Universities reacted to the
microscope and other empirical apparatus; political institutions
reacted to the steam engine, and so on. Today it is the institution of
science itself which reacts to the uncertainty it finds itself in. In
each case, technology introduces new options for doing things, and the
increased uncertainty of choice between an increased number of options
means that an attenuative process must ensue as the institution seeks
to preserve its identity. Technology in modern universities is a
particularly powerful example: what a stupid use of technology to
reproduce the ancient practices of the “classroom” online?! How
ridiculous in an age of self-publishing that academic journals seek to
use technology to maintain the “scarcity” (and cost) of their
publications through paywalls? And what is it about machine learning
and big data (I'm struggling with this in a project I'm doing at the
moment - the machine learning thing is not all it's cracked up to be!)

Judgement and decision are at the heart of this. Technologies do not
make people redundant: it is the decisions of leaders of companies and
institutions who do that. Technology does not poison the planet;
again, that process results from ineffective global political
decisions. Technology also sits in the context for decision-making,
and as Cohen and March pointed out in 1972, the process of
decision-making about technology is anything but rational (see “A
Garbage Can Model of Organizational Choice”,
https://www.jstor.org/stable/2392088). Today we see “Blockchain” and
“big data” in Cohen and March’s Garbage can. It is the reached-for
"existent technology which creates problems it can solve".

My colleague Peter Rowlands, who some of you know, puts the blame on
our current way of thinking in science: most scientific methodologies
are "synthetic" - they attempt to amalgamate existing theory and
manifest phenomena into a coherent whole. Peter's view is that an
analytic approach is required, which thinks back to originating
mechanisms. Of course, our current institutions of science make such
analytical approaches very difficult, with few journals prepared to
publish the work. That's because they are struggling to manage their
own uncertainty.

So I want to ask a deeper question: Effective science and effective
decision-making go hand-in-hand. What does an effective society
operating in a highly ambiguous and technologically abundant
environment look like? How does it use its technology for effective
decision-making? My betting is it doesn’t look anything like what
we’ve currently got!

Best wishes,

Mark

On 6 March 2018 at 20:23, Alberto J. Schuhmacher  wrote:
> Dear FIS Colleagues,
>
> I very much appreciate this opportunity to discuss with all of you.
>
> My mentors and science teachers taught me that Science had a method, rules
> and procedures that should be followed and pursued rigorously and with
> perseverance. The scientific research needed to be preceded by one or
> several hypotheses that should be subjected to validation or refutation
> through experiments designed and carried out in a laboratory. The 

Re: [Fis] Simple answer: NOT!

2018-03-07 Thread Mark Johnson
You'll be amused by this on Pavlov and Kornosky by Heinz Von Foerster...

https://m.youtube.com/watch?v=BomO7pbSVNA

The message is that one must be careful where one draws one's distinctions 

What you call data-ism is a bit like the bell without the clapper ;-)

-Original Message-
From: "Karl Javorszky" 
Sent: ‎07/‎03/‎2018 21:00
To: "Krassimir Markov" 
Cc: "fis" ; "Alberto J. Schuhmacher" 

Subject: Re: [Fis] Simple answer: NOT!

Dear Krassimir, 




Formalism is nice, but it can be unreasonable. 


Your example of 2 naked men on a beach can be made simpler by adding a dog, a 
sausage and a whistle. 


Person A (the dog's owner) sounds the whistle. The dog is apparently used to 
being fed and runs up, wagging its tail.


Can 
1) person A think:
a) the dog thinks it will be fed, 
b) this other naked guy thinks I am introducing Pawlow to formal logic; 


2) person B think: 
a) the measure of intelligence is based on the number of repetitions of a 
stimulus until the conditioned reflex is established,
b) this other naked guy is apparently teaching his dog to listen to the 
whistle;


3) the dog think
a) food coming,
b) the connection between whistle and sausage is a secret that I have mastered, 
no other dog will ever figure out the mystery,
c) we dogs live in a world in which past and future exist; these are connected 
by the moment, which is the natural home  (in fact, so far, the only home) of 
formal logic;
d) poor humans have no fur and can not therefore think.


Do you think you can't navigate a crossing because you are not able to figure 
out what the drivers in the cars coming will think? Because they are not naked? 


It is time to start talking about what Pawlow said.  Maybe, after that we can 
start discussing what Gregor Mendel said. After that, one will cry Caramba!


Time slips by while we waste time. 


Karl 


On 07.03.2018 21:10, "Krassimir Markov"  wrote:

Dear Alberto,
 
Let us imagine that we are at a naturist beach, i.e. naked.
OK! 
You will see all that I am and I will see the same of you.
 
Well, will you know what I think or shall I know the same for you?
 
Simple answer: NOT!
 
No database may contain any data about my current thoughts and feelings.
Yes, the stupid part of humanity may be controlled by big data centers.
But it has been controlled at all times. Nothing new.
 
The pseudo-scientists may analyze data and may create tons of papers.
For such “production” there have been, and will be, correspondingly bigger and
bigger cemeteries.
I have edited more than one thousand papers.
Only a few were really very important and of great scientific value!!!
 
Collection of data is an important problem and it will remain so forever.
But the greater problem for humanity is the collection of money.
 
And the latter causes the former!
And the latter is many times more dangerous than the former!
 
Do not worry of Data-ism!
Be worried of the Money-ism!
 
I will continue next week because this is my second post (thanks to the wisdom of 
Pedro, who has limited writing-letter-ism on our list!).
 
Friendly greetings
Krassimir
 
 
 
 
 
 
From: Alberto J. Schuhmacher 
Sent: Tuesday, March 06, 2018 10:23 PM
To: fis 
Subject: [Fis] Is Dataism the end of classical hypothesis-driven research and 
the beginning of data-correlation-driven research?
 
Dear FIS Colleagues,
I very much appreciate this opportunity to discuss with all of you.
My mentors and science teachers taught me that Science had a method, rules and 
procedures that should be followed and pursued rigorously and with 
perseverance. The scientific research needed to be preceded by one or several 
hypotheses that should be subjected to validation or refutation through 
experiments designed and carried out in a laboratory. The Oxford Dictionaries 
Online defines the scientific method as "a method or procedure that has 
characterized natural science since the 17th century, consisting in systematic 
observation, measurement, and experiment, and the formulation, testing, and 
modification of hypotheses". Experiments are a procedure designed to test 
hypotheses. Experiments are an important tool of the scientific method.
In our case, molecular, personalized and precision medicine aims to anticipate 
the future development of diseases in a specific individual through molecular 
markers registered in the genome, variome, metagenome, metabolome or in any of 
the multiple "omes" that make up the present "omics" language of current 
Biology.
The possibilities of applying these methodologies to the prevention and 
treatment of diseases have increased exponentially with the rise of a new 
religion, Dataism, whose foundations are inspired by scientific agnosticism, a 
way of thinking that seems classical but which, applied to research, hides a 
profound revolution.
Dataism arises from the recent human desire to collect and analyze data, data 
and more data, data 

Re: [Fis] A Paradox

2018-03-04 Thread Mark Johnson
 transducers to tweak,
how much, when and how long... and which ones to leave alone!

Best wishes,

Mark

On 4 March 2018 at 15:41, Loet Leydesdorff <l...@leydesdorff.net> wrote:

> Dear Mark,
>
> Can you, please, explain "transduction" in more detail? Perhaps, you can
> also provide examples?
>
> Best,
> Loet
>
>
> --
>
> Loet Leydesdorff
>
> Professor emeritus, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/>University of
> Sussex;
>
> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
> Hangzhou; Visiting Professor, ISTIC,
> <http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
>
> Visiting Fellow, Birkbeck <http://www.bbk.ac.uk/>, University of London;
> http://scholar.google.com/citations?user=ych9gNYJ=en
>
>
> -- Original Message --
> From: "Mark Johnson" <johnsonm...@gmail.com>
> To: "Loet Leydesdorff" <l...@leydesdorff.net>
> Cc: y...@pku.edu.cn; "FIS Group" <fis@listas.unizar.es>
> Sent: 3/4/2018 1:03:17 PM
> Subject: Re: [Fis] A Paradox
>
> Dear Loet, all,
>
> I agree with this. Our construction of reality is never that of a single
> system: there are always multiple systems and they interfere with each
> other in the way that you suggest. I would suggest that behind all the
> ins-and-outs of codification or information and meaning is a very simple
> principle of transduction. I often wonder if Luhmann’s theory isn’t really
> that different from Shannon’s (who talks about transduction endlessly). The
> fact that you've made this connection explicit and empirically justifiable
> is, I think, the most important aspect of your work. You may disagree, but
> if we kept transduction and jettisoned the rest of Luhmann's theory, I
> think we still maintain the essential point.
>
> There is some resonance (interesting word!) with McCulloch’s model of
> perception, where he considered “drome” (literally, “course-ing”,
> “running”) circuits each bearing on the other:
> http://vordenker.de/ggphilosophy/mcculloch_heterarchy.pdf (look at the pictures on pages 2
> and 3) Perception, he argued was a *syn-*drome: a combination of
> inter-effects between different circuits. There is a logic to this, but it
> is not the logic of set theory. McCulloch wrote about it. I think it’s not
> a million miles away from Joseph’s/Lupasco’s logic.
> Best wishes,
>
> Mark
>
> On 4 March 2018 at 07:03, Loet Leydesdorff <l...@leydesdorff.net> wrote:
>
>>
>> Dear Xueshan Yan,
>>
>> May I suggest moving from a set-theoretical model to a model of two (or
>> more) helices. The one dimension may be the independent and the other the
>> dependent variable at different moments of time. One can research this
>> empirically; for example, in bodies of texts.
>>
>> In my own models, I declare a third level of codes of communication
>> organizing the meanings in different directions. Meaning both codes the
>> information and refers to horizons of meaning being specifically coded.
>>
>> Might this work as an answer to your paradox?
>>
>> Best,
>> Loet
>>
>> --
>>
>> Loet Leydesdorff
>>
>> Professor emeritus, University of Amsterdam
>> Amsterdam School of Communication Research (ASCoR)
>>
>> l...@leydesdorff.net ; http://www.leydesdorff.net/
>> Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/>University of
>> Sussex;
>>
>> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
>> Hangzhou; Visiting Professor, ISTIC,
>> <http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
>>
>> Visiting Fellow, Birkbeck <http://www.bbk.ac.uk/>, University of London;
>> http://scholar.google.com/citations?user=ych9gNYJ=en
>>
>>
>> -- Original Message --
>> From: "Xueshan Yan" <y...@pku.edu.cn>
>> To: "FIS Group" <fis@listas.unizar.es>
>> Sent: 3/4/2018 2:17:01 AM
>> Subject: Re: [Fis] A Paradox
>>
>> Dear Dai, Søren, Karl, Sung, Syed, Stan, Terry, and Loet,
>>
>> I am sorry to reply you late, but I have thoroughly read every post about
>> the paradox and they have brought me many inspirations, thank you. Now I
>> offer my responses as follows:
>>
>> Dai, metaphor research is an ancient topic in linguistics, which reveals
>> the relationship between tenor and vehicle, ground and figure, target and

Re: [Fis] A Paradox

2018-03-04 Thread Mark Johnson
Dear Loet, all,

I agree with this. Our construction of reality is never that of a single
system: there are always multiple systems and they interfere with each
other in the way that you suggest. I would suggest that behind all the
ins-and-outs of codification or information and meaning is a very simple
principle of transduction. I often wonder if Luhmann’s theory isn’t really
that different from Shannon’s (who talks about transduction endlessly). The
fact that you've made this connection explicit and empirically justifiable
is, I think, the most important aspect of your work. You may disagree, but
if we kept transduction and jettisoned the rest of Luhmann's theory, I
think we still maintain the essential point.

There is some resonance (interesting word!) with McCulloch’s model of
perception, where he considered “drome” (literally, “course-ing”,
“running”) circuits each bearing on the other:
http://vordenker.de/ggphilosophy/mcculloch_heterarchy.pdf (look at the
pictures on pages 2 and 3) Perception, he argued was a *syn-*drome: a
combination of inter-effects between different circuits. There is a logic
to this, but it is not the logic of set theory. McCulloch wrote about it. I
think it’s not a million miles away from Joseph’s/Lupasco’s logic.
Best wishes,

Mark

On 4 March 2018 at 07:03, Loet Leydesdorff  wrote:

>
> Dear Xueshan Yan,
>
> May I suggest moving from a set-theoretical model to a model of two (or
> more) helices. The one dimension may be the independent and the other the
> dependent variable at different moments of time. One can research this
> empirically; for example, in bodies of texts.
>
> In my own models, I declare a third level of codes of communication
> organizing the meanings in different directions. Meaning both codes the
> information and refers to horizons of meaning being specifically coded.
>
> Might this work as an answer to your paradox?
>
> Best,
> Loet
>
> --
>
> Loet Leydesdorff
>
> Professor emeritus, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU, University of
> Sussex;
>
> Guest Professor Zhejiang Univ. ,
> Hangzhou; Visiting Professor, ISTIC,
> Beijing;
>
> Visiting Fellow, Birkbeck , University of London;
> http://scholar.google.com/citations?user=ych9gNYJ=en
>
>
> -- Original Message --
> From: "Xueshan Yan" 
> To: "FIS Group" 
> Sent: 3/4/2018 2:17:01 AM
> Subject: Re: [Fis] A Paradox
>
> Dear Dai, Søren, Karl, Sung, Syed, Stan, Terry, and Loet,
>
> I am sorry to reply you late, but I have thoroughly read every post about
> the paradox and they have brought me many inspirations, thank you. Now I
> offer my responses as follows:
>
> Dai, metaphor research is an ancient topic in linguistics, which reveals
> the relationship between tenor and vehicle, ground and figure, target and
> source based on rhetoric. But where is our information? It looks like Syed
> has given the answer: "Information is the container of meaning." If I
> understand it right, we may have this conclusion from it: Information is
> the carrier of meaning. Since we all acknowledge that sign is the carrier
> of information, the task of our Information Science will immediately become
> something like an intermediator between Semiotics (study of sign) and
> Semantics (study of meaning), which is what we absolutely do not want to see.
> For a long time, we have been hoping that the goal of Information Science
> is so basic that it can explain all information phenomena in the
> information age, just as Sung expects: consisting of axioms, theorems or
> principles, so that it can end all the debates on
> information, meaning, data, etc.; but on this view, it is very
> difficult to complete these missions. Syed, my statement is "A grammatically
> correct sentence CONTAINS information rather than the sentence itself IS
> information."
>
> Søren believes that the solution to this paradox is to establish a new
> discipline whose level is higher than that of Information Science
> as well as Linguistics, such as his Cybersemiotics. I have no right to
> review your opinion, because I haven't seen your book Cybersemiotics and I
> don't know its content, just as I don't know what the content of
> Biosemiotics is, but my view is that Peirce's Semiotics can't dissolve this
> paradox.
>
> Karl thought: "Information and meaning appear to be like key and lock,"
> which are two different things. Without one, the existence of the other will
> lose its value; this is a bit like the paradox of the hen and the egg. I don't
> know how to answer this point. However, for your "The text may be an
> information for B, while it has no information value for A. The difference
> 

Re: [Fis] The unification of the theories of information based on the category theory

2018-02-12 Thread Mark Johnson
Dear Karl,

You've communicated *your* kaleidoscope rather wonderfully. Thank you!

I shall look into it...

Best wishes,

Mark

-Original Message-
From: "Karl Javorszky" <karl.javors...@gmail.com>
Sent: ‎12/‎02/‎2018 14:36
To: "Mark Johnson" <johnsonm...@gmail.com>
Cc: "fis" <fis@listas.unizar.es>
Subject: Re: [Fis] The unification of the theories of information based on 
the category theory

Kaleidoscope, Wittgenstein
 
Dear Mark,
 
thank you for your two questions. 
1)  Kaleidoscope
The term “kaleidoscope” is used to signify a complex thing that gives different 
pictures. The toy appears to produce an unlimited number of different pictures 
to the casual user. In fact, there is a maximal number of different pictures 
that can be produced, although this may not be immediately evident to every 
child.
The term kaleidoscope was used to draw your attention to the manifold pictures 
that natural numbers generate when – as a collection – reordered. The diversity 
of pictures is indeed truly impressive. One may naively assume that there is an 
endless number of variations that can appear. This is but a subjective 
impression. In fact, if we deal with a limited number of distinguishable 
objects – which we, for convenience’s sake, enumerate -, there can appear only 
a limited number of different arrangements among these. 
How to generate cycles of expressions of (a,b) is as follows:
a)   Maximal numbers of elements in the kaleidoscope
We know that the optimal size – for information transmission purposes – for a 
collection is 136 elements, of which around 66 carry significant symbols. 
Therefore, we know also that no more than about 15 describing dimensions can be 
utilised to exhaustively describe a collection of that many elements. 
(Collections with more than 140 elements cannot be described consistently at 
all.)  Please see: www.oeis.org/A242615.   
 
b)  Generating the sorted collection of arguments (a,b)
We generate (a,b) by setting up two loops: 
begin outer loop
 a:1,16;  /* why 16: see above */
write value a;
begin inner loop; 
b: a,16 ;
write value b;
end inner loop; 
end outer loop. /* This gives us a table with 136 rows and 2 columns */
Then we sort the collection two times, once on (a,b), once on (b,a). We note 
the sequential number of each of the elements in both of the sorting orders. 
These we use to generate the cycles we are interested in (which we later 
compare to other cycles, from other reorders, as we build a more advanced 
version of the kaleidoscope). We see in this example cycles that appear during 
reorders from  into . This classical introductory example and deictic 
definition is published in www.oeis.org/A235647. 
Please use this basic version of the kaleidoscope. One can add columns.
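
As an illustration, the two loops and the cycle-reading described above can be
sketched in runnable Python roughly as follows (this is an editorial
interpretation of the procedure, not Karl's own code; the limits 1..16 and the
resulting 136 rows follow his description):

# Build the 136 pairs (a, b) with 1 <= a <= b <= 16, as in the two loops above.
pairs = [(a, b) for a in range(1, 17) for b in range(a, 17)]

# Rank every pair under the (a, b) sort order and under the (b, a) sort order.
rank_ab = {q: i for i, q in enumerate(sorted(pairs))}
rank_ba = {q: i for i, q in enumerate(sorted(pairs, key=lambda r: (r[1], r[0])))}

# The reorder from (a, b) into (b, a) is a permutation of the 136 positions;
# read off its cycles.
perm = {rank_ab[q]: rank_ba[q] for q in pairs}
seen, cycles = set(), []
for start in perm:
    if start in seen:
        continue
    cycle, current = [], start
    while current not in seen:
        seen.add(current)
        cycle.append(current)
        current = perm[current]
    cycles.append(cycle)

print(len(pairs), "pairs,", len(cycles), "cycles; cycle lengths:",
      sorted(len(c) for c in cycles))

Comparing these cycles with the cycles produced by other reorders would then
just mean repeating the last step with other sort keys (other columns).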
2)  Wittgenstein
Sitting in a snowy place and the Winter Olympics taking place right now, let me 
offer you my view of what Wittgenstein did in a parable about ski racing.
Philosophers are skiing athletes. Wittgenstein is a mediocre skier but a gifted 
mechanic. He introduces the concept of ski lifts to the sporting society. The 
ski lifts are a great invention and further the practice of skiing immensely. 
His co-athletes tell him, full of rightful indignation, that inventing, 
describing and operating a ski lift is not a sporting achievement, and falls 
definitely not under the term “skiing”. His results as an athlete are Zero.  He 
should be ashamed to try to tout a ski lift as a result of skiing. 
Wittgenstein, full of remorse, recants, agrees that ski lifts have nothing to 
do with the sport of skiing, and later in his life makes some irrelevant 
efforts of excellence in the sport sensu stricto.
 
Offering this audience of FIS participants:
a) a kaleidoscope which is exactly defined and delivers breath-taking pictures, 
b) an epistemological tool which generates undisputable facts about how <when, 
where, what and how much> are interdependent; these facts are of a numeric 
nature and root in a kind of arithmetic so simple that there is a button 
on the screen of Excel for average users, enabling them to execute the 
procedure;
this suggestion is outside of the subjects the scientists in FIS are 
researching, like using a ski lift is outside of sport. 
Accounting is not science. Forensic accounting makes life easier if one likes 
precision and exactitude. If one is interested in how place, number, amount 
translate into each other, here is a tool to study the question. There is an 
accounting link connecting the concepts mentioned above. It is multi-faceted 
and needs familiarisation – just like a kaleidoscope. This kaleidoscope is made 
of numbers. Please risk the effort and take a look at it. If your accountant 
says: this is worth looking into, it is usually reasonable to actually dedicate 
some thought to the approach. 
 
 
 




2018-02-12 10:46 GMT+01:00 Mark Johnson <johnsonm...@gmail.

Re: [Fis] New Year Lecture

2018-01-10 Thread Mark Johnson
Dear John,

Thank you very much for this - a great way to start the new year!

I'd like to ask about "communication" - it's a word which is
understood in many different ways, and in the context of cells, is
hard to imagine.

When you suggest that “the unicellular state delegates its progeny to
interact with the environment as agents, collecting data to inform the
recapitulating unicell of ecological changes that are occurring.
Through the acquisition and filtering of epigenetic marks via meiosis,
fertilization, and embryogenesis, even on into adulthood, where the
endocrine system dictates the length and depth of the stages of the
life cycle, now known to be under epigenetic control, the unicell
remains in effective synchrony with environmental changes.” It seems
that this is not communication of ‘signs’ in the Peircean sense
supported by the biosemioticians (Hoffmeyer). But is it instead a
recursive set of transductions, much in the spirit of Bateson’s
insight that:

“Formerly we thought of a hierarchy of taxa—individual, family line,
subspecies, species, etc.—as units of survival. We now see a different
hierarchy of units—gene-in-organism, organism-in environment,
ecosystem, etc. Ecology, in the widest sense, turns out to be the
study of the interaction and survival of ideas and programs (i.e.,
differences, complexes of differences, etc.) in circuits.” (from his
paper "Pathologies of Epistemology" in Steps to an Ecology of Mind)

Recursive transduction like this is a common theme in cybernetics –
it's in Ashby's "Design for a Brain", Pask's conversation theory, and
in Beer’s Viable System Model, where “horizon scanning” (an
anticipatory sub-system gathering data from the environment) is an
important part of the metasystem which maintains viability of the
organism (It’s worth noting that Maturana and Varela's autopoietic
theory overlooks this).

"Communication" would then be much more like “conversation”…
etymologically, "con-versare"… "to turn together”… dancing! Does this
fit?

A further point is to then ask whether a logic of evolutionary biology
is a logic of recursive transductions over history. The critical point
is what Joseph Brenner argued before Christmas in objecting to Peirce:
we struggle to express the specificity and basis for change in our
logic. Do we need a different kind of logic?

Best wishes and Happy new year,

Mark

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fw: Fw: Idealism and Materialism - and Empiricism

2017-11-08 Thread Mark Johnson
Dear Joseph,

This is great! I'm sympathetic to the view that a reconnection with physics is 
necessary and I too worry about the political implications of Luhmann's ideas, 
powerful work though I find it. I've started reading Logic in Reality, but am 
finding it quite hard.

I have a question about "specificity" which relates to your "discontinuities". 
One of the central issues in physics is the broken symmetry of nature. In 
biological systems and art it manifests as the Fibonacci sequence. In physics it may provide 
a link between relativity and quantum mechanical theory.  Is specificity a 
break in symmetry? 

The challenge is to speak of specificity (in time, space, matter) whilst 
maintaining that our speaking (the discourse) is part of a set of relations 
within which the specific is identified. I agree that Spencer Brown (and 
Luhmann) can't do this because he flattens the specific into a general process. 
Peirce I can't reject completely because his fascination with quaternions 
suggests that he was chasing a kind of spiral logic, which may be correct (but 
he didn't get there). How does Lupasco do this? Is there a better notation?

There are, for example, specific moments in the unfolding of a symphony where 
the symmetry is broken.  It gives rise to new ideas which would not have 
happened if the break was not there. There seems to be a logic to this.

Does Lupasco have a trick to articulate this? How does he avoid generalisation?

Best wishes,

Mark

-Original Message-
From: "Joseph Brenner" 
Sent: ‎08/‎11/‎2017 18:29
To: "fis" 
Cc: "javierwe...@gmail.com" 
Subject: [Fis] Fw:  Fw: Idealism and Materialism - and Empiricism

 
Dear Jose Javier,
 
Thank you very much for your constructive response to my note. I respect your 
view of Luhmann and his constructivism (?), which you have certainly correctly 
summarized in a few words. 
 
However, what the Lupasco theory of actuality and potentiality does is to offer 
some ontological basis for both, grounded in physics, and is hence in my opinion 
worthy of some modicum of our attention. It is possible to talk about 
reality without the pretty little diagrams and calculus of Spencer-Brown.
 
Luhmann talks about the "constant interplay" between actual and potential, 
their ineinanderstehen, but there is no functional relation to the mundane 
properties of real physical systems. As Loet showed at the time, Luhmannian 
structures can be defined analytically, but that is not enough for me. And  a 
key point: why 'constant' interplay? Is there something wrong, or is it just 
too real, to include discontinuities as equally important as continuities?
 
It should be clear that I completely disagree with the place given to Luhmann 
in current thought. Luhmann perhaps deserves some historical credit for basing  
his theory on information. However, I follow Christian Fuchs who said in 2006 
that "The function of Luhmann's theory for society is that it is completely 
useless".
 
Society does not "contain" human beings: society is a group of human beings 
composed of individuals and the group and their contradictorial relations and 
dynamics. Luhmann stated that the "ground of being" is at the same time 
actuality and potentiality, but tells us nothing about their nature and rules 
for their evolution. Meaning cannot be a unity of actualization and 
potentialization (or re- and re-). In unity, the two lose their necessary 
specificity and basis for change. Luhmann took human beings as agents out of 
his system, and replaced them with abstractions. Fascist ideology is not far 
away.
 
If people would spend 1/20 of the time on Lupasco that they do on Peirce and 
Luhmann, . . .
 
Best regards,
 
Joseph  
 
 
- Original Message - 
From: Jose Javier Blanco Rivero 
To: Joseph Brenner 
Sent: Wednesday, November 08, 2017 11:20 AM
Subject: Re: [Fis] Fw: Idealism and Materialism - and Empiricism


Dear Joseph,
Luhmann's concept of meaning (Sinn) is defined exactly as the unity of the 
difference between actuality and potentiality. Maybe an answer can be 
found there.
Besides, Luhmann's Sinn can also be translated as information since it regards 
redundancy and selection. Luhmann himself referred to Sinn (which I'd rather 
translate as sensemaking) as information processing. 
Best regards
On Nov 8, 2017 6:59 AM, "Joseph Brenner"  wrote:

Dear Colleagues,
 
This is simply to register a dissenting opinion, for similar reasons, with the 
last two notes, if nothing else to say that there can be one:
 
1. Regarding John C.'s view of the value of Peirce, there can be no common 
ground. Scholastic, propositional logic is part of the problem. His metaphysics 
has no ground in physics. Only Peirce's intuitions, to which he gives less 
value, have some value for me.
 
2. Koichiro presents some good science, but it is misapplied. Nothing tells us 
that information, or another complex 

Re: [Fis] What is “Agent”?

2017-10-23 Thread Mark Johnson
Dear all,

There are some terms from physics which we use continually and assume
we all know what they mean. I'm taking my cue from Peter Rowland's
physics - see http://anpa.onl/pdf/S36/rowlands.pdf - in asking some
fundamental questions not only about information, but about physics
itself.

1. "Dimension" - what is a dimension? We are told in school that
height, width and depth are three "dimensions", or that time is a
fourth. At the same time, we understand that a value in one dimension
is called a "scalar", and that in two dimensions we have "vectors"
(and also in more dimensions).

2. "Vector" - this gets used in all sorts of contexts from cartography
to text analysis. But we have bivectors, trivectors, pseudovectors and
then the weird rotational asymmetry of quaternions, octonions, nonions
(see Peirce's work on these in the collected papers: his emphasis on
triadic forms seems to derive from his interest in quaternions). It's
important to be clear about what we mean by "vector".

4. "Matter" and "Mass" - do we mean "mass" when we say "matter"? It's
worth noting that mass is a scalar value.

5. "Energy" - isn't this a combination of mass, space and time? (e.g.
1/2mv^2) So... a scalar, a vector and time?

6. "Time" - Is time "real" in the same way as we might consider mass
to be real?... It is perhaps surprising that mass and energy are
connected: Nuclear reactors turn scalars into vectors! Is time
imaginary? Is time i? That would make it a pseudoscalar.

7. "Conservation" - some things are conserved and other things aren't.
Time isn't conserved. Mass is. Energy is conserved. Space isn't
conserved, is it? Something weird happens with conservation...maybe
this is agency? Is information conserved?

8. "Information" - Shannon information involves counting things. On
the face of it, it's a scalar value - but in the counting process,
there is work done - both by the thing observed and by the body that
observes it. Work, like energy, is (at least) a combination of mass,
time and space. This applies to *any* counting: there is an imaginary
component, the dimensions of space and scalar mass. It probably
involves charge too.

9. "Agency" - Turning to Terry's definition of "agency", it involves
"work", "conservation" and "organisation". The definition hides some
complexities relating to the nature of work, and the ways in which
mass and charge might be conserved, but time and space isn't. Implicit
in the relation between extrinsic and intrinsic tendencies (what are
they?) is symmetry. Is agency a principle of conservation which
unfolds the symmetry between conserved and non-conserved dimensions?
That means we are in a symmetry: "a pattern that connects" - to quote
Bateson.
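
To make point 7 concrete, here is a minimal sketch of entropy-as-counting
(an illustration only, nothing to do with Rowlands's physics): Shannon's H
is obtained purely by counting symbol frequencies, and the result is a
single scalar in bits - the physical work of doing the counting never
appears in the number.

from collections import Counter
from math import log2

def shannon_entropy(sequence):
    # Count each distinct symbol, turn the counts into relative
    # frequencies, and sum p * log2(1/p): the result is one scalar, in bits.
    counts = Counter(sequence)
    n = len(sequence)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("AAAAAAAA"))  # 0.0 - nothing surprising
print(shannon_entropy("ABABABAB"))  # 1.0 - one bit per symbol
print(shannon_entropy("ABCDABCD"))  # 2.0 - two bits per symbol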

Personally, I find the value of these questions is that they render
less certain the dogmatically asserted principles of modern physics.
Maybe we need this uncertainty in order to get closer to
"information".

Best wishes,

Mark


On 23 October 2017 at 17:39, Bruno Marchal  wrote:
> Dear Gordana,
>
>
>
>
> On 20 Oct 2017, at 11:02, Gordana Dodig-Crnkovic wrote:
>
>
> Dear Terry, Bob, Loet
>
> Thank you for sharing those important thoughts about possible choices for
> the definition of agency.
>
> I would like to add one more perspective that I find in Pedro’s article
> which makes a distinction between matter-energy aspects and informational
> aspects of the same physical reality. I believe that on the fundamental
> level of information physics we have a good and simple example of how those
> two entangled aspects can be formally framed.
> As far as I can tell, Terry's definition covers chemical and biological
> agency.
> Do we want to include, apart from fundamental physics, also full cognitive and
> social agency, which are very much dominated by informational aspects
> (symbols and language)?
> Obviously there is no information without physical implementation,
>
>
>
> Hmm... I am not sure. Elementary arithmetic determines all semi-computable
> relative information states (with Oracles). So, with the numbers, once you
> accept the addition laws and the multiplication laws, information "grows"
> from inside, and consciousness differentiates.
> When the information gets deeper and deeper, in Bennett's sense of depth,
> dreams can stabilize and physical realities are "correctly" inferred, and
> eventually derived from arithmetic.
>
> That might not make your point below invalid.
>
> It is yet an important metaphysical point. The incompleteness theorem
> entails the existence of a sort of canonical information flux, or
> consciousness differentiation internal to elementary arithmetic, or
> elementary combinators, or to any universal machinery (universal in the
> mathematical Church-Turing-Post-Kleene sense).
>
> We can decide to consider the arithmetical beings as zombies, but this
> would entail a very special definition of matter to make it differ from the
> testable "arithmetical distribution".
>
> We can't have weak 

Re: [Fis] Fwd: Re: Verification of the Principle of Information Science--John Torday

2017-10-19 Thread Mark Johnson
I was thinking that these words from A.N. Whitehead's "Science and the
modern world" (1926) are highly relevant to our discussions:

"When you are criticising the philosophy of an epoch do not chiefly direct
your attention to those intellectual positions which its exponents feel it
necessary explicitly to defend. There will be some fundamental assumptions
which adherents of all the variant systems within the epoch unconsciously
presuppose. Such assumptions appear so obvious that people do not know what
they are assuming because no other way of putting things has ever occurred
to them. With these assumptions a certain limited number of types of
philosophic systems are possible, and this group of systems constitutes the
philosophy of the epoch" (p.61)

What assumptions are we blind to? From my own perspective, we assume an
education system and a science system which enable us to talk this kind of
talk. We rarely talk about the context which these systems create for us.
In order to get another "way of putting things", we should try to see more
clearly the full gamut of constraints which bind us to our existing ways of
putting things.

Best wishes,

Mark

On 19 October 2017 at 14:54, Pedro C. Marijuan 
wrote:

> (Message from John Torday -- Note: neither the list nor the server
> accepts attachments)
>
>  Mensaje reenviado 
> Asunto: Re: [Fis] Verification of the Principle of Information Science
> Fecha: Thu, 19 Oct 2017 06:45:07 -0700
> De: JOHN TORDAY  
> Para: Pedro C. Marijuan 
> 
>
> Dear All, I feel like the beggar at the banquet, having arrived at the FIS
> of late in response to Pedro's invitation to participate, having reviewed
> our paper on 'ambiguity' in Progress in Biophysics and Molecular Biology
> (see attached). In my deconvolution of evolution as all of biology
> (Dobzhansky), I have reduced the problem to the unicellular state as the
> arbiter of information and communication, dictated by The First Principles
> of Physiology- negative entropy, chemiosmosis and homeostasis. I arrived
> at that idea by following the process of evolution as ontogeny and
> phylogeny backwards from its most complex to its simplest state as a
> continuum, aided by the concept that evolution is a series of
> pre-adaptations, or exaptations or co-options. With that mind-set, the
> formation of the first cell from lipids immersed in water generated
> 'ambiguity' by maintaining a negative entropic free energy within itself in
> defiance of the external positive energy of the physical environment, and
> the Second Law of Thermodynamics. The iterative resolution of that
> ambiguous state of being is what we refer to as evolution. For me,
> information and communication are the keys, but they are not co-equals. I
> say that because in reducing the question of evolution to the single cell,
> I have been able to 'connect the dots' between biology and physics, such
> elements of Quantum Mechanics as non-localization and the Pauli Exclusion
> Principle being the basis for pleiotropy, the distribution of genetics
> throughout the organism, and The First Principles of Physiology,
> respectively. So now, thinking about the continuum from physics to biology,
> literally, the Big Bang generated the magnitude and direction of both the
> Cosmos and subsequently biology, i.e. life is a verb not a noun, a process,
> not a thing. For these reasons I place communication hierarchically 'above'
> information. Moreover, this perspective offers answers to the perennial
> questions as to how and why life is 'emergent and contingent'. The
> emergence is due to the pleiotropic property, the organism having the
> ability to retrieve 'historic' genetic traits for novel purposes. And the
> contingence is on The First Principles of Physiology. So we exist between
> the boundaries of both deterministic Principles of Physiology and the Free
> Will conferred by homeostatic control, offering a range of set-points that
> may/not evolve when necessary, depending on the prevailing environmental
> conditions.
>
> And by the way, this way of thinking plays into Pedro's comments about the
> impact of such thinking on society because in conceiving of the cell as the
> first Niche Construction (see attached), all that I have said above plays
> out as the way in which organisms interact with one another and with their
> environment based on self-referential self-organization, which is the basis
> for consciousness, all emanating from the Big Bang as their point source.
> So with all due respect, Information is the medium, but communication is in
> my opinion the message, not the other way around. I see this as a potential
> way of organize information in a contextually relevant way that is not
> anthropocentric, but objective, approximating David Bohm's 'implicate
> order'. Ciao for now, I hope. John Torday
>
>
> On Thu, Oct 

Re: [Fis] Data - Reflection - Information

2017-10-15 Thread Mark Johnson
Dear Loet,

I mean to be analytical too. The Pythonesque nature of my questioning leads 
naturally to recursion: What is the meaning of meaning? There's a logic in the 
recursion - Peirce, Spencer-Brown, Leibniz, Lou Kauffman... and you have
probed this. 

Were you or I to be part of a recursive symmetry, how would we know? Where 
would the scientia be? How would we express our knowledge? In a journal? Why 
not in a symphony? (the musicologists miss the point about music: Schoenberg 
commented once on the musical graphs of Heinrich Schenker: "where are my 
favourite tunes? Ah! There.. In those tiny notes!")

I agree that operationalisation is important. But it can (and does) happen in 
ways other than those expressed in the content of discourse.  If this topic of 
"information" is of any value, it is because it should open our senses to that. 

Best wishes,

Mark

-Original Message-
From: "Loet Leydesdorff" <l...@leydesdorff.net>
Sent: ‎15/‎10/‎2017 07:17
To: "Mark Johnson" <johnsonm...@gmail.com>; "Terrence W. DEACON" 
<dea...@berkeley.edu>; "Sungchul Ji" <s...@pharmacy.rutgers.edu>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re[2]: [Fis] Data - Reflection - Information

Dear Mark:


Do we want to defend a definition of meaning which is tied to scientific 
practice as we know it? Would that be too narrow? Ours may not be the only way 
of doing science... 
I meant my remarks analytically. You provide them with a normative turn as 
defensive against alternative ways of doing science.


A non-discursive science might be possible - a science based around shared 
musical experience, or meditation, for example. Or even Hesse's 
"Glasperlenspiel"... Higher level coordination need not necessarily occur in 
language. Our communication technologies may one day give us new 
post-linguistic ways of coordinating ourselves. 
Why should one wish to consider this as science? One can make music together 
without doing science. Musicology, however, is discursive reasoning about these 
practices.


Codification is important in our science as we know it. But it should also be 
said that our science is blind to many things. Its reductionism prevents 
effective interdisciplinary inquiry, it struggles to reconcile practices, 
bodies, and egos, and its recent obsession with journal publication has 
produced the conditions of Babel which has fed the pathology in our 
institutions. There's less meaning in the academy than there was 50 years ago.
This is a question with a Monty Python flavor: what is the meaning of science? 
what is the meaning of life?


The implication is that our distinguishing between information and meaning in 
science may be an epiphenomenon of something deeper.
One can always ask for "something deeper". The answers, however, tend to become 
religious. I am interested in operationalization and design.


Best,
Loet




Best wishes,

Mark




From: Loet Leydesdorff
Sent: ‎14/‎10/‎2017 16:06
To: Terrence W. DEACON; Sungchul Ji
Cc: foundationofinformationscience
Subject: Re: [Fis] Data - Reflection - Information


Dear Terry and colleagues, 


"Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informationa/semiotic capacities shared with other species and with biology in 
general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" of 
mankind. "Meaning" can be provided by non-humans such as animals or networks, 
but distinguishing between the information content and the meaning of a message 
requires a discourse. The discourse enables us to codify the meaning of the 
information at the supra-individual level. Discursive knowledge is based on 
further codification of this intersubjective meaning. All categories used, for 
example, in this discussion are codified in scholarly discourses. The 
discourse(s) provide(s) the top of the hierarchy that controls given the 
cybernetic principle that construction is bottom up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" instead of 
"meaning". Perhaps, this has advantages; but I am not so sure that the 
difference is more than semantic. In Cartesian Meditations (1929) he argues 
that this intersubjective intentionality provides us with the basis of an 
empirical philosophy of science. The sciences do not begin with observations, 
but with the specification of expectations in discourses. A predator also 
observes his prey, but in scholarly discourses, systematic observations serve 
the update of codified (that is, theoretical) expectations.


Best,
Loet___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Data - Reflection - Information

2017-10-14 Thread Mark Johnson
Dear Loet, 

When you say "distinguishing between the information content and the meaning of 
a message requires a discourse" this is, I think, a position regarding what 
scientific discourse does. There are, of course, competing descriptions of what 
scientific discourse does.  Does your "meaning" refers to the meaning of 
scientific discovery? Do we want to defend a definition of meaning which is 
tied to scientific practice as we know it? Would that be too narrow? Ours may 
not be the only way of doing science... 

A non-discursive science might be possible - a science based around shared 
musical experience, or meditation, for example. Or even Hesse's 
"Glasperlenspiel"... Higher level coordination need not necessarily occur in 
language. Our communication technologies may one day give us new 
post-linguistic ways of coordinating ourselves. 

Codification is important in our science as we know it. But it should also be 
said that our science is blind to many things. Its reductionism prevents 
effective interdisciplinary inquiry, it struggles to reconcile practices, 
bodies, and egos, and its recent obsession with journal publication has 
produced the conditions of Babel which has fed the pathology in our 
institutions. There's less meaning in the academy than there was 50 years ago.

The business of sense and reference which Terry refers to (and which provided a
foundation for Husserl) is indeed problematic. Some forms of communication have
only sense, and yet there is coordination, emotion and meaning. Peirce saw
something different in the underlying symmetry of communication. This is in
Bateson too (symmetrical/asymmetrical schismogenesis).

It may be that symmetrical principles underpin quantum mechanical
phenomena like entanglement; they certainly pervade biology. Medieval logicians
may have seen this: Duns Scotus's ideas on "synchronic contingency", for
example, mirror what quantum physicists are describing.

The implication is that our distinguishing between information and meaning in 
science may be an epiphenomenon of something deeper.

Best wishes,

Mark



-Original Message-
From: "Loet Leydesdorff" 
Sent: ‎14/‎10/‎2017 16:06
To: "Terrence W. DEACON" ; "Sungchul Ji" 

Cc: "foundationofinformationscience" 
Subject: Re: [Fis] Data - Reflection - Information

Dear Terry and colleagues, 


"Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informationa/semiotic capacities shared with other species and with biology in 
general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" of 
mankind. "Meaning" can be provided by non-humans such as animals or networks, 
but distinguishing between the information content and the meaning of a message 
requires a discourse. The discourse enables us to codify the meaning of the 
information at the supra-individual level. Discursive knowledge is based on 
further codification of this intersubjective meaning. All categories used, for 
example, in this discussion are codified in scholarly discourses. The 
discourse(s) provide(s) the top of the hierarchy that controls given the 
cybernetic principle that construction is bottom up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" instead of 
"meaning". Perhaps, this has advantages; but I am not so sure that the 
difference is more than semantic. In Cartesian Meditations (1929) he argues 
that this intersubjective intentionality provides us with the basis of an 
empirical philosophy of science. The sciences do not begin with observations, 
but with the specification of expectations in discourses. A predator also 
observes his prey, but in scholarly discourses, systematic observations serve 
the update of codified (that is, theoretical) expectations.


Best,
Loet___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Mark Johnson
Dear Bob,

In your Shannon Exonerata paper you have an example of three strings, their
entropies and their mutual information. I very much admire this paper, and
particularly the critique of Shannon and the emphasis on the apophatic, but
some things puzzle me. If these are strings of a living thing, then we can
assume that these strings grow over time. If sequences A, B and C are related,
then the growth of one is dependent on the growth of the others. This process
occurs in time. During the growth of the strings, even the determination of
what is and is not surprising changes with the distinction between what is seen
to be the same and what isn't.

I have begun to think that it's the relative entropy between growing things
(whether biological measurements, lines of musical counterpoint, learning) that
matters - particularly as mutual information is a variety of relative entropy.
There are dynamics in the interactions. A change in entropy for one string with
no change in entropy in the others (melody and accompaniment) is distinct from
everything changing at the same time (that's "death and transfiguration"!).

Shannon's formula isn't good at measuring change in entropy. It's even less good
with changes in distinctions which occur at critical moments ("Aha! A
discovery!" or "This is no longer surprising"). The best that we might do, I've
thought, is to segment your strings over time and examine the relative
entropies. I've done this with music. Does anyone have any other techniques?
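
For what it's worth, here is the kind of thing I mean - a minimal Python
sketch (an illustration only, not the method of the Exonerata paper): cut a
growing string into fixed windows and compute the Kullback-Leibler divergence
(relative entropy) of each window's symbol distribution against the previous
window's. The window size and smoothing constant are arbitrary assumptions;
mutual information is the same quantity taken between a joint distribution
and the product of its marginals.

from collections import Counter
from math import log2

def distribution(segment, alphabet, smooth=1e-9):
    # Smoothed relative frequencies over a fixed alphabet (smoothing
    # avoids log(0) when a symbol is absent from one window).
    counts = Counter(segment)
    return {s: (counts[s] + smooth) / (len(segment) + smooth * len(alphabet))
            for s in alphabet}

def kl_divergence(p, q):
    # D(p || q) = sum_i p(i) * log2(p(i) / q(i))
    return sum(p[s] * log2(p[s] / q[s]) for s in p)

def segment_relative_entropies(string, window=16):
    alphabet = sorted(set(string))
    segments = [string[i:i + window]
                for i in range(0, len(string) - window + 1, window)]
    dists = [distribution(seg, alphabet) for seg in segments]
    return [kl_divergence(dists[i], dists[i - 1]) for i in range(1, len(dists))]

# A toy "growing" string: a steady figure, then a new note appears, then a change
melody = "ABABABABABABABAB" + "ABACABACABACABAC" + "CDCDCDCDCDCDCDCD"
print(segment_relative_entropies(melody))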

On the apophatic, I can imagine a study of the dynamics of Ashby's homeostat 
where each unit produced one of your strings. The machine comes to its solution 
when the entropies of the dials are each 0 (redundancy 1) As the machine 
approaches its equilibrium, the constraint of each dial on every other can be 
explored by the relative entropies between the dials. If we wanted the machine 
to keep on searching and not settle, it's conceivable that you might add more 
dials into the mechanism as its relative entropy started to approach 0. What 
would this do? It would maintain a counterpoint in the relative entropies 
within the ensemble. Would adding the dial increase the apophasis? Or the 
entropy? Or the relative entropy? 
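
And, purely as a toy (nothing like Ashby's actual electromechanical device -
the update rule, the "essential" band and the 0-9 dial alphabet are all
assumptions of mine), the thought experiment can be sketched: a handful of
dials drift towards their common mean, make an Ashby-style random step change
whenever they leave the essential band, and eventually settle, so the windowed
entropies fall towards 0; the pairwise relative entropies then show how each
dial's recorded history diverges from every other's.

import random
from collections import Counter
from math import log2

ALPHABET = range(10)          # each "dial" shows an integer 0..9

def dist(seq, smooth=1e-6):
    # Smoothed distribution of a dial's recorded values over the alphabet
    c = Counter(seq)
    return [(c[s] + smooth) / (len(seq) + smooth * len(ALPHABET)) for s in ALPHABET]

def entropy(p):
    return -sum(pi * log2(pi) for pi in p)

def kl(p, q):
    # relative entropy D(p || q) between two dials' distributions
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q))

def run(n_dials=4, steps=300, band=(3, 7), seed=1):
    random.seed(seed)
    dials = [random.randint(0, 9) for _ in range(n_dials)]
    history = [[] for _ in range(n_dials)]
    for _ in range(steps):
        for i in range(n_dials):
            target = round(sum(dials) / n_dials)
            v = dials[i] + (dials[i] < target) - (dials[i] > target)
            if not band[0] <= v <= band[1]:
                v = random.randint(0, 9)   # Ashby-style random step change
            dials[i] = v
            history[i].append(v)
    return history

history = run()
# windowed entropy of each dial's recent behaviour (near 0 once settled)
print([round(entropy(dist(h[-50:])), 3) for h in history])
# pairwise relative entropies between the dials' full histories
print([[round(kl(dist(a), dist(b)), 3) for b in history] for a in history])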

Best wishes,

Mark 

-Original Message-
From: "Robert E. Ulanowicz" <u...@umces.edu>
Sent: ‎09/‎10/‎2017 15:20
To: "Mark Johnson" <johnsonm...@gmail.com>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re: [Fis] Data - Reflection - Information


> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.
<https://people.clas.ufl.edu/ulan/files/FISPAP.pdf> &
<https://people.clas.ufl.edu/ulan/files/Reckon.pdf>

Bob

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Mark Johnson
Which "information paradigm" is not a discourse framed by the education system? 
The value of the discussion about information - circular though it appears to 
be  - is that we float between discourses. This is a strength. But it is also 
the reason why we might feel we're not getting anywhere!

A perspectival shift can help of the kind that Gregory Bateson once talked 
about. When we look at a hand, do we see five fingers or four spaces? 
Discourses are a bit like fingers, aren't they?

Mark



-Original Message-
From: "Terrence W. DEACON" 
Sent: ‎09/‎10/‎2017 01:31
To: "Sungchul Ji" 
Cc: "foundationofinformationscience" 
Subject: Re: [Fis] Data - Reflection - Information

Against "meaning"


I think that there is a danger of allowing our anthropocentrism to bias the 
discussion. I worry that the term 'meaning' carries too much of a linguistic 
bias.
By this I mean that it is too attractive to use language as our archtypical 
model when we talk about information.
Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informationa/semiotic capacities shared with other species and with biology in 
general.
So I am happy to see efforts to bring in topics like music or natural signs 
like thunderstorms and would also want to cast the net well beyond humans to 
include animal calls, scent trails, and molecular signaling by hormones. And it 
is why I am more attracted to Peirce and worried about the use of Saussurean 
concepts.
Words and sentences can indeed provide meanings (as in Frege's Sinn - "sense" - 
"intension") and may also provide reference (Frege's Bedeutung - "reference" - 
"extension"), but I think that it is important to recognize that not all signs 
fit this model. Moreover, 


A sneeze is often interpreted as evidence about someone's state of health, and 
a clap of thunder may indicate an approaching storm.
These can also be interpreted differently by my dog, but it is still 
information about something, even though I would not say that they mean 
something to that interpreter. Both of these phenomena can be said to provide 
reference to something other than that sound itself, but when we use such 
phrases as "it means you have a cold" or "that means that a storm is 
approaching" we are using the term "means" somewhat metaphorically (most often 
in place of the more accurate term "indicates").


And it is even more of a stretch to use this term with respect to pictures or 
diagrams. 
So no one would say that a specific feature like the ears in a caricatured face
means something.
Though if the drawing is employed in a political cartoon e.g. with exaggerated 
ears and the whole cartoon is assigned a meaning then perhaps the exaggeration 
of this feature may become meaningful. And yet we would probably agree that 
every line of the drawing provides information contributing to that meaning.


So basically, I am advocating an effort to broaden our discussions and 
recognize that the term information applies in diverse ways to many different 
contexts. And because of this it is important to indicate the framing, whether 
physical, formal, biological, phenomenological, linguistic, etc.
For this reason, as I have suggested before, I would love to have a 
conversation in which we try to agree about which different uses of the 
information concept are appropriate for which contexts. The classic 
syntax-semantics-pragmatics distinction introduced by Charles Morris has often 
been cited in this respect, though it too is in my opinion too limited to the 
linguistic paradigm, and may be misleading when applied more broadly. I have 
suggested a parallel, less linguistic (and nested in Stan's subsumption sense) 
way of making the division: i.e. into intrinsic, referential, and normative 
analyses/properties of information. 


Thus you can analyze intrinsic properties of an informing medium [e.g. Shannon 
etc etc] irrespective of these other properties, but can't make sense of 
referential properties [e.g. what something is about, conveys] without 
considering intrinsic sign vehicle properties, and can't deal with normative 
properties [e.g. use value, contribution to function, significance, accuracy, 
truth] without also considering referential properties [e.g. what it is about].


In this respect, I am also in agreement with those who have pointed out that 
whenever we consider referential and normative properties we must also 
recognize that these are not intrinsic and are interpretation-relative. 
Nevertheless, these are legitimate and not merely subjective or nonscientific 
properties, just not physically intrinsic. I am sympathetic with those among us 
who want to restrict analysis to intrinsic properties alone, and who defend 

Re: [Fis] INFORMATION: JUST A MATTER OF MATH

2017-09-16 Thread Mark Johnson
Dear Arturo, all,

First of all, thank you to Pedro for exciting the list again - I was missing it!

I have sympathy with Arturo's position, not because I am a
mathematician (I'm not), but because I get tired of the "posturing"
that qualitative positions produce among academics. I work in
education, and education theory is full of this. Chomsky had a go at
Zizek and much postmodern social theory for this very reason:
https://www.youtube.com/watch?v=AVBOtxCfan0. He's got a point hasn't
he?

One of the exciting aspects of quantum mechanics is that some of what
we intuitively know about social life seems to be mirrored in the
quantum world and is expressible in mathematics. That this has some
empirical foundation upon which scientists can agree presents the
prospect of a deeper rethinking of a logic which might encompass a
broader spectrum of life and lived experience. This is not a new
dream: it is very similar to aims of the early cyberneticians who met
in the Macy hotel in the late 1940s.

However, progress towards this is hampered by a number of things.
1. The splits between classical mechanics and quantum mechanics, and
between quantum mechanics and relativity seem to arise from
irreconcilable originating perspectives. A colleague of mine at
Liverpool, Peter Rowlands, has been hammering away at this for over 30
years (see
https://www.amazon.co.uk/Foundations-Physical-Law-Peter-Rowlands/dp/9814618373/),
establishing a coherent mathematical description which unites
classical and quantum mechanics - but of course, such attempts often
meet with incomprehension by the physics community who have
established careers on the back of existing paradigms. There is a
human problem in addressing the physics problem!

2. The nature of mathematics and number itself is a question. It's a
very ancient question - I was delighted and surprised to learn that
John Duns Scotus worked out a logic of "superposition" in the 13th
century (he called it "synchronic contingency") - see
https://www.amazon.co.uk/Philosophy-John-Duns-Scotus/dp/0748624627.
Maths is a discourse, like physics and sociology. If there wasn't
coordination between mathematicians about the symbols they use and
their meaning, there would be no maths. Curiously, neither would there
be maths if all the mathematicians in the world perfectly agreed on all
symbols and meanings! (There'd be nothing to talk about.)

3. Given point 2, to put maths before information is to invite the
challenge that maths is information (as discourse), and without
information there is no maths!

However, can we do better than "posturing"? Yes, I think we can, and
this may well involve new empirical practices, but this requires a new
shared perspective. Maybe our approaching quantum computers will give
us this by making the weirdness of superposition, entanglements and
the inherent dynamic symmetry of the quantum world part of everyday
life...

Best wishes,

Mark

On 15 September 2017 at 14:16, tozziart...@libero.it
 wrote:
> Dear FISers,
> I'm sorry for bothering you,
> but I start not to agree from the very first principles.
>
> The only language able to describe and quantify scientific issues is
> mathematics.
> Without math, you do not have observables, and information is observable.
> Therefore, information IS energy or matter, and can be examined through
> entropies (such as., e.g., the Bekenstein-Hawking one).
>
> And, please, colleagues, do not start to write that information is
> subjective and it depends on the observer's mind. This issue has been
> already tackled by the math of physics: science already predicts that
> information can be "subjective", in the MATHEMATICAL frameworks of both
> relativity and quantum dynamics' Copenhagen interpretation.
> Therefore, the subjectivity of information is clearly framed in a TOTALLY
> physical context of matter and energy.
>
> Sorry for my polemic ideas, but, if you continue to define information on
> the basis of qualitative (and not quantitative) science, information becomes
> metaphysics, or sociology, or psychology (i.e., branches with doubtful
> possibility of achieving knowledge, due to their current lack of math).
>
>
> Arturo Tozzi
>
> AA Professor Physics, University North Texas
>
> Pediatrician ASL Na2Nord, Italy
>
> Comput Intell Lab, University Manitoba
>
> http://arturotozzi.webnode.it/
>
>
>
> Messaggio originale
> Da: "Pedro C. Marijuan" 
> Data: 15/09/2017 14.13
> A: "fis"
> Ogg: [Fis] PRINCIPLES OF IS
>
> Dear FIS Colleagues,
>
> As promised herewith the "10 principles of information science". A couple of
> previous comments may be in order.
> First, what is in general the role of principles in science? I was motivated
> by the unfinished work of philosopher Ortega y Gasset, "The idea of
> principle in Leibniz and the evolution of deductive theory" (posthumously
> published in 1958). 

Re: [Fis] Intelligence & Meaning & The Brain

2016-11-17 Thread Mark Johnson
Dear Pedro and List,

I've really enjoyed leading the session on "scientific communication"
- it was an opportunity, for which I am very grateful, to explore the
nature and purpose of scientific communication, and to experiment with
different ways of communicating. It's been a fascinating couple of
months - and now we have President Trump. Who'd have thought it?!

It wasn't so clear to me before the session, but during it I have
learnt that the arguments about constraint and redundancy relate
directly to the process of communicating through different media. When
we teach, much of what we do is add redundancy, or multiple
descriptions of things. Powerful forms of communication like music
tend to be the most redundant, and even in the prosody of language
where meaning is most powerfully conveyed, there are levels of
redundant descriptions in the tone, timbre, rhythm and pitch of the
human voice.

It was great to see the Searle-Floridi video that Marcus shared.
Searle is very entertaining and whilst his philosophy of "status
functions" is a way of explaining how meaning is constructed in
society, I've always felt that he only looks at the positive
"information" part of a declaration; but every declaration that "this
is a $5 bill" is also a declaration that something else is only a
worthless piece of paper. The $5 bill is worth $5 because it is
declared to be scarce - so a "status declaration" is a "scarcity
declaration".

Our academic world, including the publishing world, operates by
declaring of status of those who are deemed to have the authority to
make status declarations about knowledge. To put it crudely, we
academics love our titles and our status, and our universities have
built giant businesses out of it. Thorstein Veblen referred to it as
"atavistic" arguing "the standing of the savant in the mind of the
altogether unlettered is in great measure rated in terms of intimacy
with the occult forces" (Theory of the Leisure Class, final chapter).
He's right, isn't he?

But today we have a science of information - a science of uncertainty
revealed to us through the computer's lens. Yet we remain committed to
establishing findings on the pages of journals for ontological
foundations of the universe, biology, ecology, economics, etc as
"certain"; we write as if we have little doubt, that our methods are
sound, that we are an 'authority'. One has to really get to know a
professor in order to understand how they are not certain about things
- and then the learning really begins. This failure to express
uncertainty has to do with both status and the way we communicate.

So why does this matter? We turn to Trump: from an information
perspective, Trump's victory was a disaster for the pollsters. This is
a serious problem - not just for losing or winning elections - but for
auditing the effects of policies during government. The pollsters told
the government that on balance, using the appropriate measures, their
policies were working and things were getting better. But in reality,
people were really hurting and getting angry: what Stafford Beer calls
the "Algedonic loop" didn't work - there was a failure to analyse
anything to do with real feelings and experience. That this has
serious political consequences is now obvious.

So thank you all for participating and for a whole load of references
which I will explore. I hope that some of you will experiment in
adding redundancies to your scientific descriptions in new ways with
the amazing technical resources we now have at our disposal.

Best wishes,

Mark

On 17 November 2016 at 13:09, Pedro C. Marijuan
 wrote:
> Dear FIS Colleagues,
>
> Herewith the dropbox link to the Chengdu's presentation on Intelligence and
> the Information Flow (as kindly requested by Christophe and Gordana).
>
> https://www.dropbox.com/sh/wslnk41c3lquc55/AADpm_U6xuhm6jHK0esyN-29a?dl=0
>
> About the ongoing exchanges on language and meaning, there could be some
> additional arguments to consider:
>
> 1. Evolutionary origins of language (Terry can say quite a bit about that).
> It is difficult to establish a clear stage into which well formed oral
> language would have emerged. That the basis was both gestural (Susan Goldin
> Meadow) and emotional utterances seems to be more and more accepted. Alarm
> calls for instance in some monkeys contain distinct sound codes that clearly
> imply an associated meaning on what is the specific predator to take care of
> (aerial, felines, snakes) with differentiated behavioral escape responses in
> each case. Pretty more complex in human protolanguages.
> 2. Nervous Systems functioning. The action-perception cycle in advanced
> mammals would be the engine of information processing and meaning
> generation. The advancement of the life cycle would be the source and sink
> of the communicative exchanges and the ultimate reference for meaning. (This
> connects with the info flows and intelligence of my presentation).
> 3. Human 

Re: [Fis] Is quantum information the basis of spacetime?

2016-11-05 Thread Mark Johnson
Dear Moises and all,

Floridi has an excellent chapter in his "philosophy of information" called
"Against digital ontology". It's worth quoting the two fundamental
questions he asks about digital ontology:

"a. whether the physical universe might be adequately modelled digitally
and computationally, independently of whether it is actually digital and
computational in itself;

b. whether the ultimate nature of the physical universe might be actually
digital and computation in itself, independently of how it can be
effectively or adequately modelled." (Floridi, "Philosophy of Information",
p320)

My point is that this stuff is highly speculative. Of course, it might be
argued that "it from qbit" is fundamentally different from "it from bit".
But is it really? Quantum computers look rather like parallel processors,
don't they? Also the emphasis on relations rather than atoms (qbits) in the
article is interesting, but it looks like there is still an atomistic logic
behind it. It's the stuff of computer science - even if it's quantum
computer science.

I might struggle to see the point - even if I'm happy that physicists are
talking about information. If anybody was to communicate this in a way that
helps me see why this matters, they would probably have to amplify their
descriptions - in effect, add redundancy in their descriptions. In this
particular case, I think that would be very difficult.

Curiously, in the recent discussion on this list about the additional layer
of information in DNA (
http://www.sciencealert.com/scientists-confirm-a-second-layer-of-information-hiding-in-dna),
I think it would be easier to amplify the descriptions.

Best wishes,

Mark




On 5 November 2016 at 11:28, Moisés André Nisenbaum <
moises.nisenb...@ifrj.edu.br> wrote:

> Dear FISers.
>
> I was very excited with the John’s first message informing that a group of
> scientists is discussing again the role of Information in Physics.
>
>
> The high impact on FIS list of John’s post (13 replies from different
> persons in 2 days) shows that it is yet an open discussion. Thank you all
> for the very interesting posts :-)
>
> The works (not interdisciplinary nor reductionist) of Tom Stonier (1991),
> Holger Lyre (1995) and Carl Friedrich Von Weizsäcker et al. (2006) and
> many discussions on this list (http://fis.sciforum.net/fis-
> discussion-sessions/) are also about this theme.
>
>
> Scientific American article is an introduction. So I went to the source of
> the project named “It from Qubit: Simons Collaboration on Quantum Fields,
> Gravity, and Information.
>
> Home page: https://www.simonsfoundation.org/mathematics-and-physical-
> science/it-from-qubit-simons-collaboration-on-quantum-fields-gravity-and-
> information/
>
> Overview: http://web.stanford.edu/~phayden/simons/overview.pdf
>
> Project: http://web.stanford.edu/~phayden/simons/simons-proposal.pdf
>
>
>
> Mainly, it is an Interdisciplinary Research group trying to approximate
> Fundamental Physics from Quantum Information, so I think that it is a good
> and necessary initiative. Imagine what we can “extract” from this two
> fields working together!
>
>
>
> They have several projects, but I think that the final goals are not as
> important as the revelations of the processes. We should look at the
> projects. Maybe we can find that, after all, the title “it from qbit” was
> only a “marketing” (bad?) choice :-)
>
>
> Kind regards,
>
>
> Moisés
>
>
> References:
>
> STONIER, T. *Towards a new theory of information*. Journal of Information
> Science, 1991. Available at:
> http://www.scopus.com/inward/record.url?eid=2-s2.0-0026386595=tZOtx3y1
>
> “Information science is badly in need of an information theory. The paper
> discusses both the need, and the possibility of developing such a theory
> based on the assumption that information is a basic property of the
> universe.”
>
>
> LYRE, H. Quantum theory of Ur-objects as a theory of information. 
> *International
> Journal of Theoretical Physics*, v. 34, n. 8, p. 1541–1552, Aug. 1995.
>
> “The quantum theory of ur-objects proposed by C. F. von Weizsäcker has to
> be interpreted as a quantum theory of information.”
>
>
> WEIZSÄCKER, C. F. VON; GÖRNITZ, T.; LYRE, H. *The structure of physics*. 
> Dordrecht:
> Springer, 2006.
>
> “the idea of a quantum theory of binary alternatives (the so-called
> ur-theory), a unified quantum theoretical framework in which spinorial
> symmetry groups are considered to give rise to the structure of space and
> time.”
>
> 2016-11-03 16:52 GMT-02:00 John Collier :
>
>> Apparently some physicists think so.
>>
>>
>>
>> https://www.scientificamerican.com/article/tangled-up-in-spa
>> cetime/?WT.mc_id=SA_WR_20161102
>>
>>
>>
>> John Collier
>>
>> Emeritus Professor and Senior Research Associate
>>
>> Philosophy, University of KwaZulu-Natal
>>
>> http://web.ncf.ca/collier
>>
>>
>>
>> ___
>> Fis mailing list
>> 

Re: [Fis] Information as a complex notion

2016-11-05 Thread Mark Johnson
Dear all,

Just a few quick comments to relate this current discussion back to
scientific communication:

1. Taking information seriously must entail taking "communicating what
we think about information" seriously - and exploring different ways
of communicating what we think.

2. When I made my videos for this discussion, all I did was add
redundancy through a richer channel of communication.

3. The high number of recent posts suggests to me that the 'comfort
zone' of the group is clearly in exchanging philosophical positions
about information using low-bandwidth communication media (text).
Reflecting on how we talk to each other is less comfortable and more
difficult. However, the philosophical discussions seem to go round in
circles (we have had them so often before) - why?

How would the debate look if more redundancy was added to the
communication? It would, I suggest, reveal more about the constraints
of different positions. I'm sure this could be empirically explored.
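
As one very rough illustration of what "empirically explored" might look like
(a toy of my own, not a proposal for a real study): compute the Shannon
redundancy R = 1 - H/Hmax for two descriptions of the same idea, one terse and
one deliberately repetitive, and compare. Character-level counting is, of
course, only a crude proxy for the redundancy that matters in communication.

from collections import Counter
from math import log2

def redundancy(text):
    # R = 1 - H/Hmax: H is the entropy of the observed character
    # distribution, Hmax = log2(alphabet size) is the entropy the same
    # alphabet would have if every symbol were equally likely.
    symbols = list(text.lower())
    counts = Counter(symbols)
    n = len(symbols)
    if len(counts) < 2:
        return 0.0
    h = sum((c / n) * log2(n / c) for c in counts.values())
    return 1 - h / log2(len(counts))

terse = "information is constraint"
repetitive = ("information is constraint; it is restraint, restriction, "
              "the limits placed on what might otherwise have been said")
print(round(redundancy(terse), 3), round(redundancy(repetitive), 3))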

Best wishes,

Mark

On 5 November 2016 at 06:17, Emanuel Diamant  wrote:
> Dear Pedro,
>
> Dear FIS colleagues,
>
>
>
> Because our current discussion (dubbed “Scientific Communication”, announced
> by Pedro on Sept. 22, 2016) has deviated from its original purpose and
> shifted to our main and most relevant point of interest “What is
> information”, I dare to remind you of my personal views on the subject:
> Information is a complex notion. Like the notion of complex numbers in
> mathematics (which are composed of a real and an imaginary part), Information
> can be seen as composed of a real and an imaginary part – Physical
> information and Semantic Information. Physical information is a
> generalization of Shannon, Fisher, Kolmogorov, Chaitin, and other such
> informations. Semantic information still does not have a recognized
> definition. My attempts to spell out its destiny can be found on my site
> http://www.vidia-mant.info or at ResearchGate.
>
>
>
> Best regards,
>
> Emanuel.
>
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>



-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Visiting Professor
Far Eastern Federal University, Russia

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Scientific communication

2016-10-29 Thread Mark Johnson
Dear Michel,

I'm mindful that we're breaking the rules of the forum so I will
follow this up off-list, but I think this is worth mentioning to the
group.

The starting point is a diagram, or a sequence of diagrams - certainly
that's most appropriate for a systems theoretical approach like
Keen's. Can you draw some pictures to explain your understanding? With
a series of diagrams, a voice-over is easy to add. (Occasionally I
start with the voice, and add the pictures)

I also want to say that sometimes this process is a sticking point for
people, because the precision of diagrams is much more demanding than
the looseness of words: often when we start to draw something, we can
see weaknesses in our position. Then it can become a matter of 'fight
or flight': does one resist exposing potential weaknesses in one's
position, and retreat back into the world of the academic paper and
words, or does one draw the diagram anyway and acknowledge its
limitations and assumptions? I am clearly encouraging people to do the
latter, not least because I think exposing the limitations of a
position is the most important thing to communicate in an uncertain
world!

Best wishes,

Mark




On 29 October 2016 at 17:31, Michel Godron <migod...@wanadoo.fr> wrote:
> Dear Mark,
>
> It would certainly be interesting to prepare a video. But how to make it?
>
> My contribution would be marginal: I can only explain (in French, and you
> would translate, as Richard Forman did for Landscape Ecology) why the ideas
> of Keen are parallel with the main ecological models of the role of
> information in biology (life is a transmission and a management of
> information which allows every living being and every community of living
> beings - including humanity - to survive).
>
> Cordialement.
> M. Godron
> Le 29/10/2016 à 15:25, Mark Johnson a écrit :
>
> Dear Michel,
>
> Ok. Steve Keen has been close to Tony Lawson's work (he presented Minsky at
> Lawson's conference last year) - he's a supporter of his broad thesis,
> although obviously he's done more mathematical modelling which Lawson has
> some possibly valid objections to.  They've had some fascinating critical
> exchanges.
>
> This blog by Steve is interesting -
> http://www.goodreads.com/author_blog_posts/13775327-the-need-for-pluralism-in-economics
>
> Hayek is in the mix, as is perhaps a shared disdain for Stiglitz and co.
>
> I like Steve's work, and his prediction of the crisis is good, and (relevant
> to this discussion) I like the fact that his simulation tool Minsky is
> freely available for download here: https://sourceforge.net/projects/minsky/
>
> So, to come back to my original comment... It's possible to explain this
> stuff  with video, isn't it?...
>
> Best wishes,
>
> Mark
> 
> From: Michel Godron
> Sent: ‎29/‎10/‎2016 14:06
> To: Mark Johnson
> Subject: Re: [Fis] Scientific communication
>
> Dear Mark,
>
> You write : "I'm guessing you are thinking of the line of thought from
> Hayek to Stiglitz?" It is not at all my way. Among the economists I
> shoud be rather in agreement with Keen and Minsky (I could send some
> pages explaining (in french) why.
>
>   Cordialement. M. Godron
> Le 29/10/2016 à 14:22, Mark Johnson a écrit :
>> I'm guessing you are thinking of the line of thought from Hayek to
>> Stiglitz?
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>



-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Scientific communication

2016-10-29 Thread Mark Johnson
Dear Michel,

Ok. Steve Keen has been close to Tony Lawson's work (he presented Minsky at 
Lawson's conference last year) - he's a supporter of his broad thesis, although 
obviously he's done more mathematical modelling which Lawson has some possibly 
valid objections to.  They've had some fascinating critical exchanges.

This blog by Steve is interesting - 
http://www.goodreads.com/author_blog_posts/13775327-the-need-for-pluralism-in-economics

Hayek is in the mix, as is perhaps a shared disdain for Stiglitz and co. 

I like Steve's work, and his prediction of the crisis is good, and (relevant to 
this discussion) I like the fact that his simulation tool Minsky is freely 
available for download here: https://sourceforge.net/projects/minsky/

So, to come back to my original comment... It's possible to explain this stuff  
with video, isn't it?...

Best wishes,

Mark

-Original Message-
From: "Michel Godron" <migod...@wanadoo.fr>
Sent: ‎29/‎10/‎2016 14:06
To: "Mark Johnson" <johnsonm...@gmail.com>
Subject: Re: [Fis] Scientific communication

Dear Mark,

You write : "I'm guessing you are thinking of the line of thought from 
Hayek to Stiglitz?" It is not at all my way. Among the economists I 
shoud be rather in agreement with Keen and Minsky (I could send some 
pages explaining (in french) why.

  Cordialement. M. Godron
Le 29/10/2016 à 14:22, Mark Johnson a écrit :
> I'm guessing you are thinking of the line of thought from Hayek to 
> Stiglitz?

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Scientific communication

2016-10-29 Thread Mark Johnson
Dear Francesco and Michel,

I wonder if it would be possible to make a video explaining these different 
ideas?

I'm intrigued by Francesco's economics (particularly the Keynesian 
probability), and while I remain less confident than Michel that it has been 
"shown" that economics is about information (I'm guessing you are thinking of 
the line of thought from Hayek to Stiglitz?) it would be more compelling to see
these ideas expressed in richer ways than dry academic papers. Maybe there's an
important project here?

Best wishes,

Mark

-Original Message-
From: "Michel Godron" <migod...@wanadoo.fr>
Sent: ‎28/‎10/‎2016 23:27
To: "fis@listas.unizar.es" <fis@listas.unizar.es>
Subject: Re: [Fis] Scientific communication

Thank you for this very broad vision of what economics is.

Beyond the music suggested by Ilya Prigogine, it has now been shown that
economics, like ecology, is an information-management system which produces
feedback to keep the system in equilibrium. Unfortunately, this demonstration
is only sketched out, in English, in Landscape Ecology.

Cordialement.
M. Godron



Le 26/10/2016 à 16:07, Francesco Rizzo a écrit :

Dear Mark,
I do not know the thinking of the economist you point me to; I will try to fill
this gap. However, between economics and history there is a fundamental
difference: economics is a mediating science, history is a federating science.
The question "What is economics?" can be answered in many ways. For me,
economics is a way of thinking that aims to achieve the maximum result at the
minimum cost. I too adopt J. M. Keynes's theory of subjective probability, and
I hold that economic systems are founded on values that are normal from the
subjective point of view. I also suggest, as Ilya Prigogine did, taking the
paradigm of music as the basis of the whole of science, including economics.
My whole life has been dedicated to the search for the "New economics". So it
is right to say so, without any presumption or arrogance: I really have
invented a new economic conception. Congratulations on your communicative
ability, and best wishes.
Un abbraccio.
Francesco


2016-10-26 13:21 GMT+02:00 Mark Johnson <johnsonm...@gmail.com>:

Dear Jose, Francisco and Pedro, (Pedro - please could you forward if
the server won't do it?)

First of all, thank you Jose for pointing out this news story. It's
interesting to reflect that Alan Sokal's hoax of 1996 (which is
similar) was specifically directed at a discourse which he deemed to
be unscientific (postmodernism). This one is a nuclear physics
conference and clearly, nobody cares about the science - this is
about money, status and ego: I'm not sure Sokal could see the full
extent of this in the 1990s.

Francisco, I agree with you about not tarring everything with the same
brush. On the other hand, I think it is important not to stop asking
fundamental questions, not least "What is economics?". Even great
economists like Hayek and Von Mises were not convinced about its
subject matter (they thought it should be "Catallactics" - the science
of exchange) - and they were even less convinced by the maths! I do
recommend Tony Lawson's work for a broader perspective on economic
history.

Pedro, thank you for a very elegant summary of the complexities of the
"science system". I like the study of the nature of information
because, rather like cybernetics, it digs away at the foundations of
things. There is of course a practical level where we publish papers
(which few read) and fall asleep (or get drunk) at conferences (!).
But I am arguing that what we think happens in the "brownian motion
chamber" of face-to-face communication isn't as impenetrable as we
might have thought (Bateson got this), and that it is profoundly
connected not only to what we do with technology, but to the
pathologies of communication, marketisation and inauthenticity that
Sokal and others point to. This partly falls into the domain of the
phenomenologists (Alfred Schutz is important in covering this
territory), but also into the domain of artists who communicate
powerfully in different kinds of ways. There's more work to do here.

As a very speculative contribution to this, I've done one more video
which is an attempt to summarise my argument and tie it to an example
of musical communication (a Bach fugue). Alfred Schutz wrote a
wonderful paper on music called "Making Music Together"
(https://www.jstor.org/stable/40969255 - Loet told me about this years
ago, and it's one of the few really great academic papers I know). I
don't mention Schutz in the video, but I do use John Maynard Keynes's
remarkable treatise on probability from 1921.

I argue that at the root of our communication practices lie
assumptions about 'counting' and 'similarity': we make assumptions
about t

Re: [Fis] Scientific communication

2016-10-26 Thread Mark Johnson
a" (2016). Quindi non mi viene facile leggere
>> taluni rilievi critici che non possono condividere perché non è giusto fare
>> di tutte le erbe un fascio.
>> Ho rispetto del pensiero degli altri, ma ritengo sempre opportuno mettere
>> i puntini sulle i.
>> Francesco
>>
>> 2016-10-21 14:33 GMT+02:00 Pedro C. Marijuan <pcmarijuan.i...@aragon.es>:
>>>
>>> Dear Mark and FIS colleagues,
>>>
>>> It was a pity that our previous replies just crossed in time, otherwise I
>>> would have continued along your thinking lines. However, your alternative
>>> focus on who has access to the "Brownian chamber motion" is pretty exciting
>>> too.
>>>
>>> Following our FIS colleague Howard Bloom ("The Global Brain", 2000),
>>> universities and the like are a social haven for a new type of personality
>>> that does not match very well within the social order of things. It is the
>>> "Faustian type" of mental explorers, dreamers, creators of thought, etc.
>>> Historically they have been extremely important but the way they are treated
>>> (even in those "havens" themselves!), well, usually is rather frustrating
>>> except for a few fortunate parties. A long list of arch-famous scientific
>>> figures ended very badly indeed.
>>>
>>> So, in this view, people "called to the box" are the Faustians of the
>>> locality... But of course, other essential factors impinge on the box
>>> composition and inner directions, often very rudely. SCIENTIA POTESTAS EST:
>>> it means that as the box's outcomes are so influential on the
>>> technology, religion, culture, richness, prosperity, and military power,
>>> etc., a mixing of socio-political interests will impress a tough handling in
>>> the external guidance and inner contents of the poor box.
>>>
>>> And finally, the education --as you have implied-- that very often is
>>> deeply imbued with classist structures and class selection. The vitality of
>>> the Brownian box would most frequently hang from these educational
>>> structures --purses-- for both financing and arrival of new people. And that
>>> implies further administrative strings and being involved in frequent
>>> bureaucratic internecine conflicts. Gregory Clark's book (2014, The Son
>>> Also Rises) is excellent reading on class "iron statistics" everywhere,
>>> particularly in education.
>>>
>>> Eppur si muove! All those burdens have a balance of positive supporting
>>> and negative discouraging influences, different in each era. Perhaps far
>>> better in our times, but who knows... The good thing relating our discussion
>>> is that, from immemorial times, all those Brownian boxes around are
>>> wonderfully agitated and refreshed by the external communication flows of
>>> scientific publications via the multiple channels (explosive ones today,
>>> almost toxic for the Faustian).
>>>
>>> Maintaining a healthy, open-minded scientific system... easier said than
>>> done.
>>>
>>> Best regards
>>> --Pedro
>>>
>>>
>>>
>>>
>>>
>>> El 16/10/2016 a las 16:07, Mark Johnson escribió:
>>>
>>> Dear Pedro,
>>>
>>> Thank you for bringing this back down to earth again. I would like to
>>> challenge something in your first comment - partly because contained
>>> within it are issues which connect the science of information with the
>>> politics of publishing and elite education.
>>>
>>> Your 'bet' that "that oral exchange continues to be the central
>>> vehicle. It is the "Brownian Motion" that keeps running and infuses
>>> vitality to the entire edifice of science." is of course right.
>>> However, there is a political/critical issue as to who has ACCESS to
>>> the chamber with the Brownian motion.
>>>
>>> It is common for elite private schools in the UK (and I'm sure
>>> elsewhere) to say "exams aren't important to us. What matters are the
>>> things around the edges of formal education... character-building
>>> activities, contact with the elite, etc". What they mean is that they
>>> don't worry about exams because their processes of pre-selection and
>>> 'hot-housing' mean that all their students will do well in exams
>>> anyway. But nobody would argue that exams are not important for
>>> personal advancement in today's society

Re: [Fis] Scientific Communication and Publishing

2016-10-05 Thread Mark Johnson
y as if by right (where else would you go), are now forced to take
> an active role in order to maintain their preeminence in the new
> technological environment. They use their existing position to avert threats
> to their future control, through coordination  on policy, regulation and law
> (e.g. right of access to papers, brought into sharp focus by the tragic
> death of Aaron Swartz a couple of years ago).
>
> In a separate dynamic, technology is being used to manage these changes,
> which are themselves given impetus by the alignment of technology with
> managerial methods (Key Performance Indicators, etc), and with the business
> models of financialisation, privatisation and precarious employment.
>
> I don't think we will get to the bottom of these matters, still less change
> them, without engaging with the processes in a political way, however good
> our analysis of technology per se may be.
>
> Now I'll go off to check out Sci-Hub, ... or maybe I'll wait until I leave
> the office and get home.
>
> Dai
>
>
>>
>> 
>> From: Fis [fis-boun...@listas.unizar.es] on behalf of Loet Leydesdorff
>> [l...@leydesdorff.net]
>> Sent: 27 September 2016 08:27
>> To: 'Moisés André Nisenbaum'; 'Mark Johnson'; 'fis'
>>
>> Subject: Re: [Fis] Scientific Communication and Publishing
>>
>> Dear Mark, Moises, and colleagues,
>>
>> I agree that this is a very beautiful piece of work. The video is
>> impressive.
>>
>> My comment would focus on what it is that constructs reality "by language"
>> (p. 2). I agree with the remark about the risk of a linguistic fallacy; but
>> how is the domain of counterfactual expectations constructed? The answer in
>> the paper tends towards a sociological explanation: "status" for which one
>> competes in a new political economy. However, it seems to me that the
>> selection mechanism has to be specified. Can this be external to the
>> communication? How is the paradigmatic/epistemic closure and quality control
>> brought about by the communication? How is a symbolic layer shaped and
>> coded?
>>
>> One cannot reverse the reasoning: the editorial boards follow standards
>> that they perceive as relevant and can reproduce. The standards are not a
>> convention of the board since one would not easily agree. Reversing the
>> reasoning would bring us back to interests and thus to a kind of neo-Marxism
>> à la the sociology of scientific knowledge (SSK). In actor-network theory
>> (ANT) the emergence of standards happens historically/evolutionarily, but is
>> not explained.
>>
>> I don't have answers on my side. But perhaps the strength of anticipation
>> and the role of models need to be explored. Models can be entertained and
>> enable us to reconstruct a knowledge-based reality.
>>
>> Best,
>> Loet
>>
>>
>> Loet Leydesdorff
>> Professor, University of Amsterdam
>> Amsterdam School of Communication Research (ASCoR)
>> l...@leydesdorff.net ; http://www.leydesdorff.net/
>> Associate Faculty, SPRU, University of Sussex;
>> Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC,
>> Beijing;
>> Visiting Professor, Birkbeck, University of London;
>> http://scholar.google.com/citations?user=ych9gNYJ=en
>>
>>
>> -Original Message-
>> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Moisés André
>> Nisenbaum
>> Sent: Tuesday, September 27, 2016 1:45 AM
>> To: Mark Johnson
>> Cc: fis
>> Subject: Re: [Fis] Scientific Communication and Publishing
>>
>> Dear Mark.
>>
>> Thank you for the excellent video and article. It is very important to
>> discuss this and, if you agree, I will use your video with my students (can
>> you send me the transcription?).
>> No doubt we are in a changing world and we have to fight against abusive
>> processes, like the publishing industry.
>>
>> Regarding the question in Rafael's article, “what is a scientific journal in the
>> digital age?”, I understand that we must think outside the box. I think it
>> would be great if some group invented a kind of "Uber" of scientific
>> production: something that connects authors and readers directly at feasible
>> rates. arXiv makes this connection in some way, but it is not universal.
>> E-science is also a good initiative.
>>
>> Related to this discussion, UNESCO will hold an event on Wednesday
>> (Sep 28th) at Museu do Amanhã (Rio de Janeiro) called International Day
>> for Universal Access to Information (http://en.unesco.or

[Fis] Fwd: Scientific publication: Response

2016-09-30 Thread Mark Johnson
Dear FIS Colleagues,

Thank you very much for your comments. I've made a video response
which can be found here: https://youtu.be/r8T2ssGAius

The video mostly concerns Loet's comment about selection and
codification and references Sergej's point about "shared objects" (and
its relation to activity theory). Shared objects are extremely
important, but Francisco is right - Loet's point about codification
goes to the heart of the matter.

In responding to Loet (and to some extent Sergej) I draw attention to
the nature of teaching and its distinction with communication. This
means standing back from Luhmann's binary model of communication,
which he saw as a contingency-reduction process in the selection of
meaning. Instead I suggest looking at communication as a process of
the revealing and coordination of constraints. In Loet's work, I think
this is probably the same as redundancy... Both Ashby and Von Foerster
are powerful reference points for a deeper understanding - notably Von
Foerster's paper "On self organising systems and their environments"
(see http://e1020.pbworks.com/f/fulltext.pdf) and Ashby's late work on
"constraint analysis" which was somewhat obscured in the hype around
second-order cybernetics. Ashby's notebooks are the best place to
start: http://www.rossashby.info/journal/index/index.html#constraint -
he later called this "cylindrance".

I agree with Moises about new ways of thinking about accrediting
intellectual contributions. Uber is very interesting, but it
remains centralized (with a company making huge profits in
California). What if it were peer-to-peer, or the record of
contributions were 'ownerless'? There is a lot of work going on at the
moment with regard to 'decentralising the web' (see
http://www.wired.co.uk/article/tim-berners-lee-reclaim-the-web). I
think this provides a valuable indicator of where we might look for
richer mechanisms of ascribing credit for intellectual work. I'm not
sure about Berners-Lee's Linked Data, but maybe http://ipfs.io has
potential. I think these technologies present the best chance of
transforming our market-oriented logic - so, Joseph, there is hope!

As for the history, I'm no historian unfortunately... but we could do
with some proper historical analysis of scientific communication,
status and power over the centuries. The parallels between the 16/17
centuries and our own time are compelling. I predict that our
universities will one day be transformed in their approach to
education to as great an extent that the Cambridge curriculum which
Bacon so harshly criticised in 1605 (The advancement of Learning -
https://en.wikipedia.org/wiki/The_Advancement_of_Learning) was
transformed by 1700.

I work for a medical faculty in Liverpool, and today I am at the Royal
College of Physicians in London, which was founded in 1518 (see
https://en.wikipedia.org/wiki/Royal_College_of_Physicians) They have
an extraordinary archive here, which raises more questions about how
scientists before the Royal Society communicated with one another. We
ought to get a better grip on the historical shift that occurred in
the 1660s so that we have a better understanding of what kind of shift
to expect in the years to come. As a side comment, I recommend looking
at T.S. Eliot's analysis of the transformation that occurred in poetry
in the same period - what he called a "dissociation of sensibility".

Best wishes and many thanks for your comments,

Mark

--
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com


-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Visiting Professor
Far Eastern Federal University, Russia

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Mark Johnson
Dear all,

Is this a question about counting? I'm thinking that Ashby noted that Shannon 
information is basically counting. What do we do when we count something?

Analogy is fundamental - how things are seen to be the same may be more 
important than how they are seen to be different. 

It seems that this example of DNA is a case where knowledge advances because 
what was once thought to be the same (for example, perceived empirical 
regularities in genetic analysis) is later identified to be different in 
identifiable ways.

Science has tended to assume that by observing regularities, causes can be 
discursively constructed. But maybe another way of looking at it is to say what 
is discursively constructed are the countable analogies between events. 
Determining analogies constrains perception of what is countable, and by 
extension what we can say about nature; new knowledge changes that perception.

Information theory (Shannon) demands that analogies are made explicit - the 
indices have to be agreed. What do we count? Why x? Why not y? Otherwise the 
measurements make no sense. I think this is an insight that Ashby had, and why 
he championed Information Theory as analogous to his Law of Requisite Variety 
(incidentally, Keynes's Treatise on Probability contains a similar idea about 
analogy and knowledge). Is there any reason why the "relations of production" 
in a mechanism shouldn't be counted? Determining the analogies is the key 
thing, isn't it?
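
To make the point concrete, here is a minimal Python sketch (the observation 
sequence and the two categorisations are invented purely for illustration): the 
same six events, counted under two different analogies, give different values 
of H, so the measure only makes sense once the indices have been agreed.

import math
from collections import Counter

def shannon_entropy_bits(observations):
    # H = -sum p(i) * log2 p(i), where p(i) is the relative frequency
    # of category i among the observations.
    counts = Counter(observations)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# The same sequence of events, indexed under two different "analogies".
events = ["sparrow", "robin", "sparrow", "bat", "robin", "sparrow"]

# Analogy 1: count by species.
h_species = shannon_entropy_bits(events)

# Analogy 2: count the same events only as "bird" vs "non-bird".
recoded = ["bird" if e in ("sparrow", "robin") else "non-bird" for e in events]
h_recoded = shannon_entropy_bits(recoded)

print(f"H counted by species:       {h_species:.3f} bits")
print(f"H counted as bird/non-bird: {h_recoded:.3f} bits")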

One further point is that determining analogies in theory is different from 
measuring them in practice. Ashby's concept of cybernetics-as-method was: "the 
cyberneticist observes what might have happened but did not". There is a point 
where idealised analogies cannot map onto experience. Then we learn something 
new.

Best wishes,

Mark


-Original Message-
From: "Loet Leydesdorff" 
Sent: 09/06/2016 12:52
To: "'John Collier'" ; "'Joseph Brenner'" 
; "'fis'" 
Subject: Re: [Fis] Fw:  "Mechanical Information" in DNA

Dear colleagues, 
 
It seems to me that a definition of information should be compatible with the 
possibility to measure information in bits of information. Bits of information 
are dimensionless and “yet meaningless.” The meaning can be provided by the 
substantive system that is thus measured. For example, semantics can be 
measured using a semantic map; changes in the map can be measured as changes in 
the distributions, for example, of words. One can, for example, study whether 
change in one semantic domain is larger and/or faster than in another. The 
results (expressed in bits, dits or nits of information) can be provided with 
meaning by the substantive theorizing about the domain(s) under study. One may 
wish to call this “meaningful information”. 
 
I am aware that several authors have defined information as a difference that 
makes a difference (McKay, 1969; Bateson, 1973). It seems to me that this is 
“meaningful information”. Information is contained in just a series of 
differences or a distribution. Whether the differences make a difference seems 
to me a matter of statistical testing. Are the differences significant or not? 
If they are significant, they teach us about the (substantive!) systems under 
study, and can thus be provided with meaning in the terms of  studying these 
systems. 
 
Kauffman et al. (2008, at p. 28) define information as “natural selection 
assembling the very constraints on the release of energy that then constitutes 
work and the propagation of organization.” How can one measure this 
information? Can the difference that the differences in it make, be tested for 
their significance? 
 
Varela (1979, p. 266) argued that since the word “information” is derived from 
“in-formare,” the semantics call for the specification of a system of reference 
to be informed. The system of reference provides the information with meaning, 
but the meaning is not in the information which is “yet meaningless”. 
Otherwise, there are as many “informations” as there are systems of reference 
and the use of the word itself becomes a source of confusion.
 
In summary, it seems to me that the achievement of defining information more 
abstractly as measurement in bits (H = - Σ p log(p)) and the availability of 
statistics should not be ignored. From this perspective, information theory can 
be considered as another form of statistics (entropy statistics). A substantive 
definition of information itself is no longer meaningful (and perhaps even 
obscure): the expected information content of a distribution or the information 
contained in the message that an event has happened, can be expressed in bits 
or other measures of information.
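
As a minimal numerical sketch of these two quantities (the four-word 
distribution below is invented for illustration only), the expected information 
content of a distribution and the information contained in the message that one 
particular event has happened can both be expressed in bits:

import math

# An invented distribution over four words, purely for illustration.
p = {"energy": 0.4, "power": 0.3, "meaning": 0.2, "code": 0.1}

# Expected information content of the distribution: H = -sum p(i) log2 p(i).
H = -sum(prob * math.log2(prob) for prob in p.values())

# Information contained in the message that one particular event happened:
# the surprisal -log2 p(i) of that event.
surprisal = -math.log2(p["code"])

print(f"H of the distribution: {H:.3f} bits")       # about 1.846 bits
print(f"Surprisal of 'code':   {surprisal:.3f} bits")  # about 3.322 bits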
 
Best,
Loet
 



Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/ 
Associate Faculty, 

[Fis] _ Re: _ Re: _ FIS discusion

2016-05-01 Thread Mark Johnson
Dear FIS colleagues,

I'm sympathetic to Maxine's comment: "untethered observations and
meticulous descriptions are the cornerstone of any life science. One
is not out there trying to
make others as you want them to be, but attempting to know them as
they are. The task is precisely
a challenge since it is a matter of achieving knowledge about living
bodies that are different from,
yet evolutionarily connected to, your living body."

Poets are much better at "describing" than scientists. They understand
how connotation works - and it gives them much greater precision in
their communication (Ezra Pound writes wonderfully about this in his
"ABC of reading"). Scientific discourse attempts to ignore connotation
in favour of denotation - and yet it cannot escape connotations.
Personally, I find my intellectual - or emotional - response to
academic arguments is a connotative response which academic convention
demands I put into some denotative form. (Really I should express it
in music.)

This tension is played out in practical life in education. What do we
do in education? Do we attempt to make others as we would want them to
be? - this appears to be the view of education promulgated by our
politicians today. It only seems to make everybody miserable - it
turns people into the subjects of denotative judgements. Good teachers
will form connotative judgements by listening carefully - to the
individual and to society. Sometimes it only takes listening.

I become mindful that better description might be achieved by talking
less and listening more. Is listening describing?

Best wishes,

Mark







On 1 May 2016 at 06:59, Loet Leydesdorff  wrote:
> Dear colleagues,
>
>
>
> In my opinion, one can distinguish between the order of generation and
> emerging control. While consciousness (perhaps) arises from matter in terms
> of its generation, and language perhaps from movements, once these
> next-order systems levels have arisen, they tend to take over control and to
> reorganize the order within and among underlying systems levels. Language,
> for example, can further be developed into specialist languages, computer
> languages, etc., which affect (discipline) our behavior from above. The
> reduction of the complexity of language—used among other things to give
> meaning to events—to linguistic behavior by language carriers becomes then
> one research program among other research programs (e.g., Stan’s program to
> organize the world in terms of hierarchies).
>
>
>
> One observes historical instantiations that may be organized along
> trajectories. Evolutionarily, this generates variation and remains
> phenotypical/phenomenological. The selection mechanisms are not directly
> observable; they are specified by us in scholarly discourse and
> knowledge-based. Their specification sometimes provides more sophisticated
> (since theoretically informed) meaning to the same phenomena. The puzzle
> fascinating me is how this knowledge and information order transforms the
> underlying orders; first as feedback, but then increasingly as feedforward.
> Note that this does not de-legitimate the reductionist programs, but
> reduces their philosophical aspiration to one among possible research
> programs.
>
>
>
> Best,
>
> Loet
>
>
>
> 
>
> Loet Leydesdorff
>
> Professor, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU, University of Sussex;
>
> Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC,
> Beijing;
>
> Visiting Professor, Birkbeck, University of London;
>
> http://scholar.google.com/citations?user=ych9gNYJ=en
>
>
>
> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Alex Hankey
> Sent: Sunday, May 01, 2016 1:09 AM
> To: FIS Webinar
> Cc: Maxine Sheets-Johnstone
> Subject: [Fis] _ Re: _ FIS discusion
>
>
>
> In Answer to Maxine's comments
>
>
>
> While I understand Maxine's concern that we retain a phenomenological
> orientation in these discussions, and am gratified that in places we do seem
> to be achieving that, I also feel that many of us are here to bring our own
> particular perspectives, whether in Maths (Louis), Physics (myself), or
> Philosophy (albeit with Pragmatist leanings - Soren Brier), and to leave the
> phenomenologists themselves (such as Maxine) to take what is of use and
> translate it more precisely into terms that phenomenologists will accept
> more readily.
>
>
>
> For myself, I often have to listen to ideas (or students' questions) from
> those not familiar with strict scientific technicalities, and then to answer
> them in a language chosen for to try and avoid them being swamped (blinded?)
> by science.
>
>
>
> At the same time, I would like to thank Maxine for the depth and clarity of
> her thoughts - particularly her comment, "The bodies we are not", which I
> read through Vedanta-tinged spectacles (!!), 

Re: [Fis] DISCUSSION SESSION: INFOBIOSEMIOTICS

2016-04-10 Thread Mark Johnson
I'd be interested to know whether "mattering" is considered within
"meaning". I suspect "mattering" is distinct.

thinking aloud

Science isn't just meaningful. It matters to scientists. Perhaps it's
only because it matters to some people that it exists.

Re. meaning, I think the connection between meaning and expectation is
correct. I agree Shannon is helpful for constructing approaches to
explore this. But we expect many things, yet only a few of them really
matter.

There are many varieties of transcendental argument about information
which start from assumed mechanistic properties of nature. Yet we have
no certainty about whether nature's apparent regularities are real or
not - it is conjecture. There does appear to be a kind of "cybernetic
functionalism" (which I think is what Soren is complaining about)
which maintains scepticism about reality at one level, and positivism
at another. Not all cybernetics is culpable of this however. I would
be interested in an approach to information which avoids untestable
assumptions about "natural necessity".

Is there a "personalist" interpretation of information which starts
from concrete personal being (note that 'personal' does not have to
mean "individual": persons well be relations), and does not seek to
reduce personal being to more abstract "foundations"?

From a personalist perspective, information may simply be constraint.
Is the difference between things that matter and things that mean
something a difference in the relations between constraints? Bateson's
double-bind, which definitely matters to those caught in it, is a
particular dynamics of constraint. Bateson also specified constraint
dynamics in what he called "symmetrical schismogenesis" (seen in
tit-for-tat engagements, fighting) and "complementary schismogenesis"
(seen in master-slave relations). This is a good start.

A question which I don't think Bateson addresses, but maybe Ashby had
an idea about, is what science would look like if we sought agreement
about the constraints which we share rather than our theories about
causation. I don't think that would be a functionalist pursuit.

Best wishes,

Mark







On 9 April 2016 at 11:21, Loet Leydesdorff  wrote:
> Dear Pedro,
>
>
>
> I disagree about putting "meaning" outside the scope of natural sciences.
>
>
>
> I doubt that anybody on this list would disagree about using the metaphor of
> meaning in the natural sciences.
>
>
>
> Maturana (1978, p. 49): “In still other words, if an organism is observed in
> its operation within a second-order consensual domain, it appears to the
> observer as if its nervous system interacted with internal representations
> of the circumstances of its interactions, and as if the changes of state of
> the organism were determined by the semantic value of these representations.
> Yet all that takes place in the operation of the nervous system is the
> structure-determined dynamics of changing relations of relative neuronal
> activity proper to a closed neuronal network.”
>
> http://www.enolagaia.com/M78BoL.html#Descriptions
>
>
>
> In other context, Maturana used the concept of “languaging”.
>
>
>
> My point is about the differentia specifica of inter-human communication
> which assumes a next-order contingency of expectations structured by
> “horizons of meaning” (Husserl). One needs a specific (social-science) set
> of theories and methods to access this domain, in my opinion. In concrete
> projects, one can try to operationalize in terms of the information sciences
> / information theory. One can also collaborate “interdisciplinarily” at the
> relevant interface, notably with the computer sciences. The use of metaphors
> in other disciplines, however,  cannot be denied.
>
>
>
> This is just a reaction; I had one penny left this week. :-)
>
>
>
> Best,
>
> Loet
>
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>



-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Visiting Professor
Far Eastern Federal University, Russia

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] _ Re: _ Re: _ DISCUSSION SESSION: INFOBIOSEMIOTICS

2016-04-04 Thread Mark Johnson
Dear Lou,

"The fusion of thought with itself is the place from which we can
understand the fusion of ourselves with Nature in a unity that
precedes the apparent distinctions that we take for granted."

This is a powerful thought!

It looks like a statement of Husserlian phenomenology - it
transcendentalises thought. We might ask, What does "understand" mean
here? Where does "acting" fit? Where are "real people" with bodies and
emotions? Of course, these issues can be addressed, but to think like
this is a path which is chosen, rather than being a statement of fact.

The point in my post concerns politics rather than phenomenology.
Human dignity and freedom are at the heart of politics. I'd once
thought that there was little politics in Bateson - only epistemology.
Now I think this was wrong (double-binds are political aren't they?).
Bateson's distinguishing the way people think from the way nature
works is a critical (political) point. It's also a choice.

I've been thinking about the different choices which might be made
with regard to the relationship between nature and thought - and where
each might lead. Within them are different approaches to
"information".

1. The elision of thought and nature and transcendentalising thought:
we have to be careful with statements like "thought is part of nature"
or that it is inseparable from nature. Lack of precision in such
statements is easily manipulated to excuse bad things, or to decry
empirical observations.

2. Separating thought from nature (as Bateson appears to do) and
transcendentalising constraint. Where does this choice lead? I think
Ashby's epistemology maintained something like this: it leads to the
pursuit of error, and the exposure of the limits which nature bears
upon thought, and thought on nature. It's a different kind of
methodology.

3. To transcendentalise nature. This leads to a classical empiricist
epistemology - the methodical investigation of event regularities in
experiments, and the construction of knowledge about possible causes.

Each "choice" has value. Each seeks to subsume the others within its
scheme. Each is defined in relation to the others. How to choose? With
what criteria?

Furthermore, each choice is 'personal': individuals have particular
reasons for adhering to one approach and disliking others.

Is this resolvable?

Best wishes,

Mark

On 4 April 2016 at 06:51, Louis H Kauffman <kauff...@uic.edu> wrote:
> Dear Mark,
> 1. The way we think is part of how Nature works.
> 2. Thought is not separate from our contact with Nature.
> 3. Concept arises in the integration of thought and percept.
> 4. Thought is singular in that thought can be the object of thought and this 
> becomes a place where subject and object are coalesced.
> 5. The fusion of thought with itself is the place from which we can 
> understand the fusion of ourselves with Nature in a unity that precedes
> the apparent distinctions that we take for granted.
> Best,
> Lou K.
>
>> On Apr 3, 2016, at 3:49 AM, Mark Johnson <johnsonm...@gmail.com> wrote:
>>
>> Dear Soren, Lou and Loet,
>>
>> I can appreciate that Bateson might have had it in for hypnotists and
>> missionaries, but therapists can be really useful! Had Othello had a
>> good one, Desdemona would have lived – they might have even done some
>> family therapy!
>>
>> More deeply, Bateson’s highlighting of the difference between the way
>> we think and the way nature works is important. How can a concept of
>> information help us to think in tune with nature, rather than against
>> it?
>>
>> Loet’s description of social systems as encoded systems of
>> expectations within which selections are made is helpful. A concept of
>> information is such a selection. But we live in a world of finite
>> resources and our expectations form within what appear to be real
>> limits: Othello saw only one Desdemona. Similarly, there appears to be
>> scarcity of food, money, shelter, safety, education, opportunity for
>> ourselves and for our children upon whose flourishing we stake our own
>> happiness. These limits may be imagined or constructed, but their
>> effects are real to the point that people will risk their lives
>> crossing oceans, fight and kill for them. This is a result of how we
>> think: it leads to hierarchy, exclusion and the production of more
>> scarcity. Nature appears not to work like this.
>>
>> If we accept that the way we think is fundamentally different from the
>> way nature works, how might a concept of information avoid
>> exacerbating the pathologies of human existence? Wouldn’t it just turn
>> us into information bible-bashers hawking our ideas in online forums
>> (because universities a

[Fis] _ Re: _ Response to several commentators:

2016-03-01 Thread Mark Johnson
I'm really grateful for Maxine's summary here.

To those who query the value of phenomenology, I find myself reflecting on
what Alain Badiou and a number of others (e.g. Burrell and Morgan's
"Sociological Paradigms and Organisational Analysis") have argued  in there
being (at least) three main trends in Western thought in the 20th century:
Analytical philosophy/functionalism, phenomenology and existentialism, and
Marxism.

Each 'paradigm' (to borrow Burrell and Morgan's word) represents an
important perspective, and each harbours totalising ambitions seeing itself
in contradistinction to the others. Most thinking about "information" is
analytical and functionalist in orientation: obviously with Shannon, but
even the most ecological and "second-order" of cybernetic approaches cannot
escape functionalism. From functionalism, there will always be a tendency
to criticise phenomenology as in some way "woolly", or Marxist critique
as "blah blah sociology".

Phenomenologists of all kinds will rightly point out that functionalism
cannot explain everything. Many of them, like Husserl, have intimate
familiarity with analytical and functionalist arguments. As Husserl saw,
mathematical abstractions rest on foundations in the human soul to which it
is blind. Whether or not Husserl was successful in articulating a way to
uncover the structures of consciousness is beside the point: it was, and
continues to be, a profoundly important question to which functionalism has
no answer.

And then there's politics. Because from academic jousting matches in online
fora, to open access to academic papers, to the pathologies of our
universities who are meant to support debates but who now operate like
businesses, to catastrophic disparities in life chances in the world, and
inequalities of wealth, the social context demands action as much as it
does reflection.

If "information" as a topic is to have any value then it cannot restrict
itself to functionalism and analysis alone. It has to address all three
perspectives (and maybe more). I don't know if this is possible. But I see
that the value of encountering phenomenology as we have done is to
highlight the deficiencies of a single-paradigm viewpoint.

best wishes,

Mark

On 29 February 2016 at 23:01, Maxine Sheets-Johnstone 
wrote:

> To FIS Colleagues,
>
> There are common threads running through communications from Mark, Loet,
> Jerry, and Marcus that I would like to address. I thank them for their
> concerns and the issues they raise. I thank Plamen too for his response,
> specifically for upholding the value of phenomenology, though disagreeing
> with him in his giving prominence to Merleau-Ponty as a phenomenologist. I
> would like to comment on that point of disagreement first.
>
> (1): I just wrote an invited essay on Merleau-Ponty for an Oxford book on
> Phenomenology and Psychopathology. I noted first off that
> "Merleau-Ponty’s writings in psychopathology were both exceptional and
> non-exceptional. They were exceptional in bringing scientific research into
> phenomenology. Husserl had written from time to time on the abnormal—for
> example, in Ideas II, Husserl considers what transpires when a particular
> sense organ no longer functions normally while others continue to do so
> (Husserl 1989, pp. 71ff.)—but he did not delve into the
> psychopathological.  Heidegger too might be cited: the ‘they’ might be
> viewed as metaphysically abnormal, the ‘they’ being those who repress
> recognition of their own mortality, who see death as happening only to
> others, and whom Heidegger deems ‘inauthentic’. Merleau-Ponty, in contrast,
> delved into contemporary studies of psychopathology, in particular, the
> extensive studies of Kurt Goldstein and Adhémar Gelb. He also based his own
> psychopathological analyses to a large extent on the writings of Sigmund
> Freud even as he diverged from them. Thus one might say that he devoted
> himself assiduously to available contemporary literature in the then
> burgeoning fields of neuropsychiatry and psychoanalysis."
>
> More of Merleau-Ponty, Phenomenology, and Psychopathology perhaps at
> another time. For a later time too, perhaps, Merleau-Ponty’s affiliating
> himself with biologist Jakob von Uexküll, undoubtedly because von Uexküll’s
> conjunction of animal and world in “functional tones” connected with
> Merleau-Ponty’s own conjunction of seer/seen and touching/touched, and
> possibly also because Merleau-Ponty’s disavowal of Darwin’s “origin of
> species” i.e., Darwin’s theory of natural selection, straightaway agreed
> with  von Uexküll’s disavowal of Darwin’s “origin of species.”
>
> What is of preeminent note here is that Merleau-Ponty never engaged in the
> actual practice of phenomenology. He thereby threw away the backbone of
> phenomenology, namely, its methodology. Phenomenological methodology is the
> topic warranting serious address here in our discussion. I’ve mentioned it
> in earlier responses but 

Re: [Fis] _ Pirate Bay of Science

2016-02-13 Thread Mark Johnson
hear, hear!

There's something important about the politics of information in this case.
Sociologist Steve Fuller has argued that the open access movement is merely
a "consumerist revolt, academic style" (see
http://sociologicalimagination.org/archives/9953/comment-page-1). It's an
interesting case he makes, but I think he's wrong.

Is there a connection between Floridi's information ethics and open access
where a more defensible justification grounded in information science can
be made? Or some other theoretical framework?

best wishes,

Mark

On 13 February 2016 at 16:12, Bob Logan  wrote:

> Dear FISers fyi - Bob
>
>
>
>
> Begin forwarded message:
>
>
>
>
> http://www.sciencealert.com/this-woman-has-illegally-uploaded-millions-of-journal-articles-in-an-attempt-to-open-up-science
>
> Researcher illegally shares millions of science papers free online to
> spread knowledge
>
> Welcome to the Pirate Bay of science.
> FIONA MACDONALD
> 12 FEB 2016
>
> A researcher in Russia has made more than 48 million journal articles -
> almost every single peer-reviewed paper ever published - freely available
> online. And she's now refusing to shut the site down, despite a
> court injunction and a lawsuit from Elsevier, one of the world's biggest
> publishers.
>
> For those of you who aren't already using it, the site in question is
> Sci-Hub, and it's sort of like a Pirate Bay of the
> science world. It was established in 2011 by neuroscientist Alexandra
> Elbakyan, who was frustrated that she couldn't afford to access the
> articles needed for her research, and it's since gone viral, with hundreds
> of thousands of papers being downloaded daily. But at the end of last year,
> the site was ordered to be taken down by a New York district court
> 
>  - a ruling that Elbakyan has decided to fight, triggering a debate over
> who really owns science.
>
> "Payment of $32 is just insane when you need to skim or read tens or
> hundreds of these papers to do research. I obtained these papers by
> pirating them," Elbakyan told Torrent Freak last year.
> "Everyone should have access to knowledge regardless of their income or
> affiliation. And that’s absolutely legal."
>
> If it sounds like a modern day Robin Hood struggle, that's because it
> kinda is. But in this story, it's not just the poor who don't have access
> to scientific papers - journal subscriptions have become so expensive that
> leading universities such as Harvard and Cornell have
> admitted they can no longer afford them. Researchers have also taken a
> stand - with 15,000 scientists vowing to boycott publisher Elsevier
> in part for its excessive paywall fees.
>
> Don't get us wrong, journal publishers have also done a whole lot of good
> - they've encouraged better research thanks to peer review, and before the
> Internet, they were crucial to the dissemination of knowledge.
>
> But in recent years, more and more people are beginning to question
> whether they're still helping the progress of science. In fact, in some
> cases, the 'publish or perish' mentality is creating more problems than
> solutions, with a growing number of predatory publishers now charging
> researchers to have their work published - often without any proper peer
> review process or even editing.
>
> "They feel pressured to do this," Elbakyan wrote in an open letter to the
> New York judge last year. "If a researcher
> wants to be recognised, make a career - he or she needs to have
> publications in such journals."
>
> That's where Sci-Hub comes into the picture. The site works in two stages.
> First of all when you search for a paper, Sci-Hub tries to immediately
> download it from fellow pirate database LibGen. If that
> doesn't work, Sci-Hub is able to bypass journal paywalls thanks to a range
> of access keys that have been donated by anonymous academics (thank you,
> science spies).
>
> This means that Sci-Hub can instantly access any paper published by the
> big guys, including JSTOR, Springer, Sage, and Elsevier, and deliver it to
> you for free within seconds. The site then automatically sends a copy of
> that paper to LibGen, to help share the love.
>

Re: [Fis] Locality & Five Momenta . . .

2015-10-30 Thread Mark johnson
Dear FIS colleagues,

I'm curious about why the discussion about momenta matters. Does it matter 
because we believe it is important to determine the boundaries of specific 
discourses? Does that matter because we fear incoherence or confusion in our 
discussion if we don't demarcate boundaries? And yet the determination of 
discourse boundaries throws more complexity into the debate: the coherent 
discourse we might hope for runs away from us as we try to grasp it.

What assumptions are we making in asserting momenta? What might it preclude? It 
seems to me that a moment-orientation carries a philosophical realist undertone 
(I'm familiar with it from Bhaskar's critical realism - this looks similar to 
his dialectical MELDA formulation - difficult to get into but insightful). There 
is an assumption about observers, and there is an assumption about natural 
necessity - what speculative realists call 'correlationism'. 

It seems that it matters more to academics than to ordinary people that deep 
ontological issues should be decided upon. Science, after all, depends on 
continual critical questioning about nature. Yet most sensible non-academic 
people might prefer to have a drink with friends rather than hurt their brains: might 
they take a more pragmatic view that such matters of deep ontology are 
essentially undecidable? Or that their deep social (love) relationships really 
matter beyond everything else?

Before embarking on schematising momenta, perhaps it would be useful to think 
about what matters in this. Fundamentalism is an ever present risk for all 
academics. And additionally, information has a bearing on both matter and 
mattering after all...

Best wishes,

Mark


-Original Message-
From: "Stanley N Salthe" 
Sent: 30/10/2015 13:24
To: "Marcus Abundis" <55m...@gmail.com>; "fis" 
Subject: Re: [Fis] Locality & Five Momenta . . .

Marcus wrote:
– I find myself thinking Five Momenta must represent five types of localities. 
I ask if that “smells right” to you. If so, I would think that “localizing 
hierarchies” would also be needed. For example, I see: 1) passive descriptions 
of Nature (aka natural philosophy, general science) as a different locality 
than, 2) anthropogenic or anthropocentric deeds (human semiotics+acts). One 
might even then add 3) biological processes mediating between 1 & 2.  All 
represent essentially different systems of meaning, no? But then, the Five 
(suggested) Momenta would be subordinate to 1, 2, and 3 in different ways, as I 
read things. Evaluation (cataloguing) of different localized traits seems to me 
as a possible useful path. Thoughts?
Marcus -- The momenta as given by Pedro: philosophy, biomolecular, 
multicellular, sociality, information do not make up a logical hierarchy, 
either subsumptive or compositional. One possible, idealistic, reading is in 
subsumption:
{mind {microbiology {macrobiology {sociality {conceptualization}}}}}
STAN



On Fri, Oct 30, 2015 at 2:49 AM, Marcus Abundis <55m...@gmail.com> wrote:

Loet, thanks for your note (Sat Oct 24) . . . an interesting twist on things I 
had not been considering.



John, re (Tue Oct 27) “rigorous connections using the entropy concept . . . 
most people don't understand entropy . . . So I haven't published”
– This interests me, as my own work heads in a general “entropic” direction.


Pedro, Steve & Stan – re various notes on Locality, Five Momenta and Hierarchy.
– I find myself thinking Five Momenta must represent five types of localities. 
I ask if that “smells right” to you. If so, I would think that “localizing 
hierarchies” would also be needed. For example, I see: 1) passive descriptions 
of Nature (aka natural philosophy, general science) as a different locality 
than, 2) anthropogenic or anthropocentric deeds (human semiotics+acts). One 
might even then add 3) biological processes mediating between 1 & 2.  All 
represent essentially different systems of meaning, no? But then, the Five 
(suggested) Momenta would be subordinate to 1, 2, and 3 in different ways, as I 
read things. Evaluation (cataloguing) of different localized traits seems to me 
as a possible useful path. Thoughts?


Re Chaitin – an interesting article, to be sure, but for the reasons Joseph 
points out (and more) I agree with his posted thoughts.


Finally, in following the posted notes, I find this “discussion about 
discussion” instructive.








___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Shannon-Weavers' Levels A, B, C.

2015-10-17 Thread Mark Johnson
Dear Loet, Joseph and Fis colleagues,

Some thoughts:

Pascal: "the heart has its reasons [the constraints of the body] which the
reason cannot perceive [because it is abstract]" and yet... we do come to
know the reasons of the heart - we know them long before we know reason. In
language, as Joseph says, "Less is more" - precisely, in my experience at
least, because ambiguities reveal the reasons of the heart. Poetic language
lifts the veil of everyday language to expose the raw, embodied, uncodified
constraints which underpin it. Music is more powerful still (Alfred Schutz
wrote about this wonderfully).

Can we fashion a description of how this works with existing theory? (I
don't believe we should surrender the territory to psychologists!)

Parsons's idea of 'double-contingency' of communication presents an
interaction between ego and alter where communication emerges through
selections of meaning and utterance of each party. Schutz, whose theory of
intersubjectivity was important for Parsons (they had a significant and
difficult correspondence about these matters which is well-documented in
Richard Grathoff's "The Theory of Social Action: The Correspondence of
Alfred Schutz and Talcott Parsons") found Parsons's model too
functionalist. Parsons and Schutz have a different understanding of how
people 'tune-in' to one another: I see Parsons's model as effectively
'digital': a set of multi-level interacting selections; Schutz's concept is
more 'analogue', involving sharing a sense of 'inner time' between people.
I prefer to think of this as a shared constraint.

Loet's redescription of double-contingency in terms of mutual redundancies
loosens the determinism in both Parsons and Luhmann (who followed him). I
think this is important, and opens a space for reconsidering Schutz's
understanding of how 'tuning-in' might work.

It's best to start with 'selection' (of utterance, meaning and
understanding in Luhmann). Shannon selections are constrained by redundancy
as we know, so turning the spotlight on the redundancies rather than on what
is selected allows us to differentiate the intersubjective communication
between two people talking face-to-face as one of higher mutual
constraint/redundancy than the intersubjective situation of writing an
email or a listserve post to Fis. Locality makes a difference in increasing
mutual constraint.
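
As a rough numerical sketch of this idea of mutual constraint (the joint counts 
for the two situations below are invented for illustration), the mutual 
information between ego's and alter's selections comes out higher where the 
coupling is tight, as in face-to-face talk, than where it is loose, as in an 
email exchange:

import math

def mutual_information_bits(joint_counts):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), computed from raw counts of (x, y) pairs.
    total = sum(joint_counts.values())
    px, py = {}, {}
    for (x, y), c in joint_counts.items():
        px[x] = px.get(x, 0) + c
        py[y] = py.get(y, 0) + c
    def H(counts):
        return -sum((c / total) * math.log2(c / total) for c in counts.values())
    return H(px) + H(py) - H(joint_counts)

# Ego's and alter's selections when tightly coupled (invented face-to-face counts).
face_to_face = {("greet", "greet"): 45, ("greet", "ignore"): 5,
                ("ignore", "greet"): 5, ("ignore", "ignore"): 45}

# The same repertoire of selections, loosely coupled (invented email counts).
email = {("greet", "greet"): 30, ("greet", "ignore"): 20,
         ("ignore", "greet"): 20, ("ignore", "ignore"): 30}

print(f"I(ego;alter), face-to-face: {mutual_information_bits(face_to_face):.3f} bits")
print(f"I(ego;alter), by email:     {mutual_information_bits(email):.3f} bits")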

Returning to Pascal, my body constrains my thoughts in ways which cannot be
abstractly modelled, and yet I can apprehend my own constraints and those
of others, whilst not necessarily being able to articulate them in
language. I could however make music, wave my arms around, pull an angry
face, or cry. Isn't inference of constraints by observing such behaviour
essential to communication? How could double contingency work were we not
able to grasp and physically feel what constrains the other? Babies
wouldn't survive otherwise!

Just to extend the speculation a bit further, we should ask about the
process of knowledge construction itself in the light of mutual redundancy.
Since Hume, many believe that the agreement of scientists in the light of
event regularities is a factor in the development of knowledge. What do
those scientists 'tune into' when they do this? In what way might an
empirical event regularity be a mutual constraint? How are physical
constraints separable from personal, biological or psychological
constraints?

Might apparently 'woolly' (but, IMO, valuable) sociomaterial accounts of
science be reframed as analytical accounts of interacting constraints?

just some thoughts...

Mark

On Thu, Oct 15, 2015 at 1:38 PM, Loet Leydesdorff 
wrote:

> Dear Marcus, Mark, Bob, and colleagues,
>
>
>
> My ambition was a bit more modest: the paper does not contain a theory of
> meaning or a theory of everything. It is an attempt to solve a problem in
> the relation between sociology (i.c. Luhmann) focusing on meaning
> processing (and autopoiesis) and (Shannon-type) information theory. Luhmann
> left this problem behind by defining information as a selection, while in
> my opinion entropy is a measure of diversity and therefore variation. I was
> very happy to find the clues in Weaver’s contributions; Katherine Hayles
> has signaled this previously.
>
>
>
> Another author important in the background is Herbert Simon who specified
> the model of vertical differentiation (1973), but without having Maturana &
> Varela’s theory of autopoiesis for specification of the dynamics. I agree
> with Luhmann that one has to incorporate ideas from Husserl about horizons
> of meaning and Parsons’ symbolically generalized media as structuring these
> horizons for understanding the differentia specifica of the social as
> non-biological.
>
>
>
> Mark more or less answers his own questions, don’t you? The constraints of
> the body provide the contingency. The options are not given, but
> constructed and need thus to be perceived, either by individuals or at the
> organizational 

Re: [Fis] Shannon-Weavers' Levels A, B, C.

2015-10-14 Thread Mark Johnson
Maybe I've missed something, but the subsumption I mentioned
(referring to Bateson) was not between A, B and C: these are
co-existent interacting dynamics as I understand them, and certainly a
very rigorous and powerful generative model.

I was worrying about the subsumption of Bateson's "imagination" into
"rigour". Loet's model does have 'imagination' in it, in the
generation of redundancies. But does it include the human imagination
capable of conceiving a model of itself?

I wonder if a possible answer to the question lies in Loet's work.
Human embodiment is a constraint which an abstract rigorous model can
never have. Within dynamics of mutual redundancy, won't the
complexities of mutual redundancies of embodied existence always
outweigh the mutual redundancies that can be abstractly modelled?

best wishes,

Mark


On Wed, Oct 14, 2015 at 9:32 PM, Robert E. Ulanowicz  wrote:
>
>> On 2015-10-14, at 12:38 PM, Marcus Abundis wrote:
>>
>>> RE Mark Johnson's post of Thu Oct 1 09:47:13 on Bateson and imagination
> Two quick remarks:
>
> 1. It's not at all clear to me that C is subsumptive of B.
>
> 2. I would lobby for Shannon/Bayesian relationships as an intermediary
> between A. and B (i.e., preliminary to "meaning").
>
> Cheers to all,
> Bob U.
>
>>> . . .
>>>  – Me Too!
>>>
>>> RE Loet & Stan's postings beginning Thu Oct 1 21:19:50 . . .
>>> >  I suggest to distinguish between three levels (following Weaver): <
>>> > A. (Shannon-type) information processing ; <
>>> > B. meaning sharing using languages;<
>>> > C. translations among coded communications.<
>>> > So, here we have a subsumptive hierarchy"<
>>>
>>> I was wondering if this note means to imply an *all inclusive* list of
>>> traits to be considered in modeling information? Or, alternatively . . .
>>> what would such an all inclusive list look like?
>>>
>>> Thanks!
>>>
>>>
>>>
>>> Marcus Abundis
>>> about.me/marcus.abundis
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] [Fwd: Re: Information is a linguistic description of structures]--T...

2015-10-01 Thread Mark Johnson
Dear Loet and colleagues,

I wonder if an alternative view is possible: that the symbolic
codification of the sciences inherent in discourse and supported by
our universities (as they are currently constituted) is a constraint
which prevents us exploring a proper science of constraint. To
overcome it requires not just words and papers, but new forms of
practice, pedagogy, organisation and innovative uses of technology
(possibly what Gordon Pask referred to as 'maverick machines').
Expectations of academic practice - particularly within University
management - make this very hard to establish. Gregory Bateson
identified this very clearly - I recommend his essay at the end of
"Mind and Nature", "Time is out of joint".

There are perhaps some encouraging signs: the practices of artists and
musicians with new technologies, for example, or innovative approaches
to design. The challenge in taking such things more seriously lies in
thinking creatively about how we talk with each other about them.
Bateson understood the problem: he called it the "anti-aesthetic
assumption" which "Bacon, Locke and Newton long ago gave to the
physical sciences, viz that all phenomena (including the mental) can
and shall be studied and evaluated in quantitative terms."

Bateson's argument is that there are two complementary aspects to
mental process: a conservative, rigorous inner logic that demands
compatibility and conformance, and an imaginative, adaptive response
by nature in order to survive in a changing world. It is a mistake, I
think, to subsume the imaginative within the 'conservative inner
logic', which is the tendency of the language-oriented view of the
world. Somehow the balance has to be struck: "Rigour alone is
paralytic death, but imagination alone is insanity".

The point is that this has to be struck organisationally and
institutionally. Bateson ends by asking the Board of Regents at the
University of California (in 1978) "Do we, as a Board, foster whatever
will promote in students, in faculty, and around the boardroom table
those wider perspectives which will bring our system back into an
appropriate synchrony or harmony between rigour and imagination?" It's
an important question. How many university managers would even
understand it today?

Best wishes,

Mark





On Thu, Oct 1, 2015 at 7:14 AM, Loet Leydesdorff  wrote:
> in other words, it's time we confess in science just how little we know
> about language, that we explore language's mysteries, and that we use our
> discoveries as a crowbar to pry open the secrets of this highly contextual,
> deeply relational, profoundly communicational cosmos.
>
>
>
> Dear colleagues,
>
>
>
> The vernacular is not sufficiently codified to contain the complexity of the
> sciences. One needs specialized languages (jargons) that are based on
> symbolic codification. The codes can be unpacked in elaborate language; but
> they remain under re-construction. The further differentiation of codes of
> communication drives the complexity and therefore the advancement of the
> sciences as discursive constructs.
>
>
>
> This cultural evolution remains rooted in and generated by the underlying
> levels. For example, individuals provide variety by making new knowledge
> claims. Since the selection is at the level of communication, however, this
> level tends to take over control. But not as an agent; it further
> differentiates into different forms of communication such as scientific
> discourse, political discourse, etc. Sociologists (Parsons, Luhmann) have
> proposed “symbolically generalized media of communication” which span
> horizons of meaning. “Energy”, for example, has a meaning in science very
> different from its meaning in political discourse. Translations remain of
> course possible; local organizations and agents have to integrate different
> meanings in action (variation; reproduction).
>
>
>
> In my recent paper on the Self-organization of meaning (at
> http://arxiv.org/abs/1507.05251 ), I suggest to distinguish between three
> levels (following Weaver): A. (Shannon-type) information processing ; B.
> meaning sharing using languages; C. translations among coded communications.
> The horizontal and vertical feedback and feedforward mechanisms (entropy
> generation vs. redundancy generation in terms of increasing the number of
> options) are further to be specified.
>
>
>
> Hopefully, this contributes to our discussion.
>
>
>
> Best,
>
> Loet
>
>
>
>
>
> 
>
> Loet Leydesdorff
>
> Professor Emeritus, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Honorary Professor, SPRU, University of Sussex;
>
> Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC,
> Beijing;
>
> Visiting Professor, Birkbeck, University of London;
>
> http://scholar.google.com/citations?user=ych9gNYJ=en
>
>
>
>
>
>
> 

Re: [Fis] Answer to Mark

2015-08-01 Thread Mark Johnson
Dear Fernando,

Without wanting to spawn a new debate, I think it might be useful to flag
something up about the 'phenomenology' that you mention. I understand
Joseph's reaction to what you say and I agree. However, phenomenology is
a rich and complex topic, and few scholars have the tenacity to delve deeply
into the difficult and detailed thinking of Husserl, Heidegger, and Schutz,
tracing its evolution in French existentialism, hermeneutics, or from
Schutz to Berger, Luckmann, Parsons and then Luhmann. At the very least
there is the division between Husserlian transcendental phenomenology with
its transcendental ego to which Heidegger and many others objected, and
the existential phenomenology of everyday experience which Heidegger
developed instead. Husserl, for his part thought Heidegger had completely
misunderstood him. To say he might have been right is not to take away the
genius of Heidegger's own insights.

The point is, when we say phenomenology, what do we mean?

Joseph's concern relates (I think) to what appears to be a missing account
of intersubjectivity in your paper. But of course, intersubjectivity was
a central concern for Husserl, and his ideas on it were much refined by
Schutz, who seems to me to be a critically important figure (I'm grateful
to Loet for pointing me in Schutz's direction!). To be 'phenomenological'
does not preclude intersubjectivity. However, if you are Heideggerian, then
I think it is true that Heidegger's understanding of human relations is
rather weak (interesting to reflect on this in relation to Heidegger's
politics!)

I suspect that the phenomenological literature and its history is of
considerable relevance to current debates about information.

Best wishes,

Mark

On Thu, Jul 30, 2015 at 9:17 AM, Fernando Flores 
fernando.flo...@kultur.lu.se wrote:

 Dear Mark



 Thanks for your commentaries. Our use of the term “foundational” is more
 philosophical than practical. You are right; the term contradicts in some
 sense our intentions which are “very” practical. (This is a term which we
 could leave behind without hesitation.) In fact, we have no intention of
 “instituting” a new concept of “information”. Our work is “foundational”
 only in one aspect, and that is in searching for methods to measure the
 informational value of collective acts in everyday life. We found that it
 was necessary to classify human acts in such a way that their informational
 value could be “operative” (useful in practical tasks); we did that by
 grouping the acts into types according to their complexity. We found that
 these acts could also be distinguished in relation to their consequences on
 the everyday world. We noticed that the movement from the very complex acts
 to the simplest acts follows a reduction of the surrounding world and that
 the human body is the natural reference in the understanding of this
 reduction. We knew that we could express informational value in relation to
 probabilities and found in the von Mises/Popper frequency series a possible
 and easy solution (an accessible mathematics). We insist; we have been
 working only with practical problems and we have not been thinking so much
 of which concept of information we are using; we believe that cybernetics
 does not address the practical problems we confront. However, we are sure
 that if we succeed, some cybernetic theorem will explain our success. The
 question is that the state of knowledge we have today is insufficient to
 understand the simplest informational problems in our surrounding world.
 Information theory and cybernetics have been developed in the world of
 physics; we, instead, try to develop solutions that work in everyday life.
 If you understand as “variety” the measure of the “states of a system”, the
 series of von Mises/Popper could be our kind of variety, but we are not
 sure. You are right: our “acts” are neither “actions” nor “events”, but
 they are not the hybrids of Latour either. Our acts are phenomenological;
 they are intended to be congruent with concepts as “work”, “money”,
 “culture”, “thing”, “market”, and the like. The concept “informational
 value”, for example, is very close to the concept of “information” without
 meaning exactly the same thing.
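
 To make the frequency idea concrete, here is a toy sketch in Python. It is
 only one possible reading of “informational value” (the surprisal of an act
 under frequency-based probabilities), not the exact formulation in our paper,
 and the acts and counts below are invented for illustration.

import math
from collections import Counter

# An invented record of everyday acts (illustrative only).
observed_acts = ["greet", "greet", "pay", "greet", "dance", "greet", "pay"]

freq = Counter(observed_acts)
total = sum(freq.values())

def informational_value_bits(act):
    # Surprisal -log2(p), with p estimated as the act's relative
    # frequency in the observed series (a frequency-style probability).
    p = freq[act] / total
    return -math.log2(p)

for act in ("greet", "pay", "dance"):
    print(act, round(informational_value_bits(act), 2))
# The frequent act ("greet") carries a low informational value (about 0.81
# bits); the rare act ("dance") carries a high one (about 2.81 bits).

 On this reading, a series dominated by highly probable acts has a low average
 informational value, which is close in spirit to what we call “order”.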





 Fernando Flores PhD

 Associate Professor

 History of Ideas and Sciences

 Lund University



 ___
 Fis mailing list
 Fis@listas.unizar.es
 http://listas.unizar.es/cgi-bin/mailman/listinfo/fis




-- 
Dr. Mark William Johnson
Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Information Foundation of the Act--F.Flores L.deMarcos

2015-07-29 Thread Mark Johnson
Dear Drs Flores and de-Marcos


I very much enjoyed reading this paper: there is a lot to reflect on and I
need to spend more time with it, but I have some immediate questions which
I hope you can address.



   1. The first concerns the concept of the ‘foundations’ in your title.
   Having lurked on this list for some time, I see talk about foundations
   forming the bulk of the discussions. Foundations can frame narratives or
   explanations which can generate new imaginative possibilities and reveal
   new ways of exploring those possibilities in nature: perhaps what Gordon
   Pask meant when he called cybernetics the “art and science of manipulating
   defensible metaphors”. But foundations can also be blinding, and clever
   people can become overly attached to them (Pask’s obsession with
   conversation may be a good example!). What does it mean to you to say
   information is a ‘foundation’ for acts?
   2. Your approach to information seems very practical. You make the appeal
   to “information” to justify an approach to ‘counting’ information in acts
   (for example, counting information in technological acts, modernisation,
   etc). In cybernetic terms this seems to me like counting ‘variety’ (I was
   thinking of Beer’s diagram of variety attenuation in the insurance firm –
   Heart of Enterprise, p521). Shannon’s information theory also provides a
   way of counting – Beer uses his equation in the same case on the previous
   page – and maybe Shannon’s counting of “surprises” also says something
   about Beer’s variety counting too. To see counting surprises as not that
   different from counting ‘variety’ is also to see the relationship between
   Shannon’s information and Ashby’s homeostat as clearly complementary. I was
   left wondering if your idea of information conservation wasn't a disguised
   re-statement of Ashby's law? How is your concept of information distinct
   from variety? (A small numerical sketch of the two ways of counting follows
   after these three questions.)
   3. Leading on from both these questions is a further question concerning
   the acts of the scientist. These are not acts like those instrumental acts
   studied in scientific management. They are much messier, appearing to
   scholars like Latour as 'entanglements': Karen Barad puts the radical case
   that in scientific practice “matter and meaning are not separable” (see her
   Meeting the Universe Halfway). I was wondering if you had thought about
   accounting for your own scientific acts in developing this work within your
   informational scheme? Does your concept of information as a foundation for
   acts help us to understand science?
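
To make the comparison in question 2 concrete, here is a small Python sketch
(my own toy illustration with invented numbers, not something drawn from your
paper or from Beer): with four equiprobable states, an Ashby-style count of
variety and Shannon's expected surprisal coincide, but they come apart as soon
as the frequencies are skewed.

import math

def variety_bits(n_states):
    # Ashby-style variety counted in bits: log2 of the number of
    # distinguishable states of the system.
    return math.log2(n_states)

def shannon_entropy_bits(probs):
    # Shannon's H: the expected surprisal -log2(p) over the states.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equiprobable states: both counts give 2 bits.
print(variety_bits(4))                                 # 2.0
print(shannon_entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Skewed frequencies: the variety is unchanged, but the average
# "surprise" falls below it.
print(shannon_entropy_bits([0.7, 0.1, 0.1, 0.1]))      # about 1.36

Since H never exceeds log2(n), and equals it only for the uniform
distribution, the two ways of counting agree exactly when every state is
equally likely, which is perhaps where the connection to variety (and to
Ashby's law) would need to be argued.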


I hope these questions make sense, but many thanks for a very stimulating
and important paper.

Mark

On Wed, Jul 22, 2015 at 1:33 PM, Pedro C. Marijuan 
pcmarijuan.i...@aragon.es wrote:


 * The informational foundation of the act*
 *Fernando Flores*
 Lund University
 fernando.flo...@kultur.lu.se

 *Luis de-Marcos*
 University of Alcalá
 luis.demar...@uah.es

 *See the whole text at: http://fis.sciforum.net/resources/
 http://fis.sciforum.net/resources/*

 Our introductory paper (35 pages) presents a theory that quantifies the
 informational value of human acts. We argue that living is functioning
 against entropy and following Erwin Schrödinger we call this tendency
 “negentropy”. Negentropy is for us the reason behind “order” in social and
 cultural life. Further, we understand “order” as the condition that the
 world reaches when the informational value of a series of acts is low.
 Acting is presented as a set of decisions and choices that create order, and
 this is the key concept of our understanding of the variation from
 simplicity to complexity in human acts. The most important aim of our
 theory is to measure non-economic acts trying to understand and explain
 their importance for society and culture. In turn, such a theory will also
 be important for understanding the similarities and differences between
 non-economic and economic acts.
 We follow the classical concept according to which informational value is
 proportional to the unlikelihood of an act. To capture the richness of the
 unlikelihood of human acts we use the frequency theory of probability
 developed by Ludwig von Mises and Karl Popper. Frequency theory of
 probability allows us to describe a variety of acts from the most
 “free” to the least “free” with respect to precedent acts. In short, we
 characterize human acts in terms of their degree of freedom trying to set
 up a scale of the information and predictability carried by human
 decisions. A taxonomy of acts is also presented, categorizing acts as
 destructive, mechanical, ludic or vital, according to their degree of
 freedom (complexity). A formulation to estimate the informational value in
 individual and collective acts follows. The final part of the paper
 presents and discusses the consequences of our theory. We argue that
 artifacts embed information and that modernization can be understood as a
 one-way