Dear Loet, all,

I'm not sure there's such a problem here. I think there's an important
connection to be made through Bateson. He too talks about the importance of
redundancy (he's not alone in cybernetics, as you know - Von Foerster's
"Self organizing systems and their environments"; and Pask explicitly talks
about redundancy and learning (in his "Cybernetics of Human Learning and
Performance" of 1975)).

In Bateson, I think the key is his emphasis in "Mind and Nature" on
"multiple descriptions of the world". There are redundancies in multiple
descriptions, and yes they are generative of new possibilities which shape
the emerging discourse. But the problem with Aristotelian logic is that it
cannot embrace multiplicity and contradiction. We always seem to hit upon
contradictions - particularly at the boundary between ontology and
epistemology, and feel compelled to go one way or the other. But these are
just different descriptions at different logical levels, aren't they?

One of the things which excites me about John Torday's work is what appears
to be a deepening of Bateson's Jungian distinction between "creatura" and
"pleroma" (Terry has a lot to say about this:
http://www.biourbanism.org/the-pattern-which-connects-pleroma-to-creatura-the-autocell-bridge-from-physics-to-life/)
to something that's consistent with quantum physics: something like David
Bohm's "implicate" vs "explicate" order (I know that Bob Ulanowicz, among
others, has some objections to Bohm's "hidden variables", but perhaps
there's something to explore).

We can (and unfortunately often do) impose an epistemological perspective
upon everything. We can say "it's all talk in the end". But this is a
flattening operation where the multiplicity of description is transduced
into a single description. We lose Alex Hankey's "sense of being stared at"
to start with - and that's a big deal in my opinion. The institutions of
science - the universities - are particularly effective at flattening the
world: "if it can't be described in a textbook and put on a syllabus, it
isn't worth attending to!": the "dead hand of education" pushes ontology
out in favour of epistemology in our institutionally-framed discourse. Our
science cannot advance unless we do something about this.

In Bateson's work on Schizophrenia there is the beginning of an
articulation of a different kind of logic of inclusion, contradiction and
multiple levels - the "Double Bind". My initial (admittedly early-days)
reading of Joseph Brenner and Lupasco suggests to me that the achievement
is to have formalised a similar logic in a way that Bateson never did
(Joseph, please correct me if you don't agree!): there is contradiction,
there is an included middle, there is resolution at a higher level.
Actually, it resonates for me because it seems quite similar to Nigel
Howard's theory of "Metagames" which he formalised in his "Paradoxes of
Rationality", which you and I wrote about once. Bateson himself talked
about "bioentropy" - there's an excellent paper by Peter Harries-Jones on
this here: http://www.mdpi.com/1099-4300/12/12/2359

Finally, I think there is another distinction which is important for John
and which was initially emphasised by Bohm: the distinction between
synthesis and analysis in scientific approaches, and the problem of
description. John protests against "descriptive biology", and aims for a
deeper understanding of generative mechanisms. Bohm writes:

"It is important to call attention to the difference between analysis and
description. The word 'de-scribe' literally means to 'write down', but when
we write things down, this does not in general mean that the terms
appearing in such a description can be actually 'loosened' or 'separated'
into autonomously behaving components, and then put back together again in
a synthesis. Rather, these terms are in general abstractions which have
little or no meaning when considered as autonomous and separate from each
other. Indeed, what is primarily relevant in a description is how the terms
are related by ratio or reason. It is this ratio or reason, which calls
attention to the whole, that is meant by a description. Thus, even
conceptually, a description does not in general constitute an analysis."
(Bohm, "Wholeness and the Implicate Order", p160, Routledge edition)

Bohm, like Bateson, identifies the problem in the way we reason. He
suggests new algebras: he even hints at concepts like nilpotents and
quaternions which have been developed by my colleague Peter Rowlands (see
https://www.liverpool.ac.uk/physical-sciences/events/fpl/)... but
fundamentally it's a different kind of logic.
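To give a flavour of these algebras, here is a toy sketch (my illustration, not Rowlands' formalism): quaternion multiplication is order-sensitive, and a nilpotent is non-zero yet squares to zero.

```python
# Quaternions (w, x, y, z) ~ w + xi + yj + zk: the Hamilton product
# is non-commutative, so the order of composition carries information.
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
assert qmul(i, j) == k               # ij = k ...
assert qmul(j, i) == (0, 0, 0, -1)   # ... but ji = -k: order matters

# A nilpotent 2x2 matrix: N is non-zero, yet N^2 is the zero matrix.
N = [[0, 1],
     [0, 0]]
N2 = [[sum(N[r][m] * N[m][c] for m in range(2)) for c in range(2)]
      for r in range(2)]
assert N2 == [[0, 0], [0, 0]]
```

Both objects violate the habits of ordinary scalar arithmetic, which is the sense in which such algebras amount to "a different kind of logic".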

Best wishes,

Mark

On 14 January 2018 at 08:37, Loet Leydesdorff <l...@leydesdorff.net> wrote:

> Dear Joe and colleagues,
>
> This seems counterproductive to me. The generative mechanism is
> knowledge-based; notably based on discursive knowledge. The latter is
> specific to the human species. Dolphins and dogs may be able to language,
> but they cannot handle a credit card or understand the rule of law. In my
> opinion, the generative mechanism is the generation of redundancy by making
> further distinctions and thus options. Meaning cannot be communicated, but
> it can be shared and be redundant. Information is communicated and
> generates probabilistic entropy. Adding redundancy extends the maximum
> entropy; we live in a different (cultural) world after the next transition.
>
> For example, transportation over the Alps was first restricted by the
> passes such as the Gotthard and the Brenner passes. But using railways
> tunneling under the Alps or airplanes flying over them, the number of
> options is multiplied by orders of magnitude. The physical restrictions on
> the ground are overcome.
>
> Redundancy generation can be measured in terms of (negative!) bits of
> information.
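Loet's point can be made concrete with a toy calculation (the traffic shares below are invented for illustration; they are not Loet's data): adding options extends the maximum entropy, and the unused part of that capacity, H_max minus the realized entropy H, is the redundancy.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Two Alpine passes, used equally: H = H_max = 1 bit, redundancy 0.
passes = [0.5, 0.5]

# Eight crossing options (passes, tunnels, flights), with traffic still
# concentrated on the old passes.
options = [0.4, 0.4, 0.05, 0.05, 0.025, 0.025, 0.025, 0.025]

h_max = log2(len(options))   # 3.0 bits: the extended maximum entropy
h = entropy(options)         # ~2.02 bits actually realized
redundancy = h_max - h       # ~0.98 bits of not-yet-used options
```

The gain shows up as negative bits in the sense that it subtracts from the realized information relative to the new maximum: the new options are there before they are used.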
>
> The message is that there is no abstract "Logic in Reality" as a general
> systems dynamic, but this is itself a knowledge claim. The substance other
> than *res extensa* -- that is, *res cogitans* -- is not provided in data and
> to be measured in bits, but in terms of absences to be measured in -data.
> In other words, I disagree with Sungchul's positivism: meaning in
> inter-human communications is not objective, but intersubjective. It is
> provided with hindsight to the historical events and with reference to
> horizons of meaning.
>
> Best,
> Loet
>
>
> PS. Pedro, this is my first message in the new week. L.
>
> ------------------------------
>
> Loet Leydesdorff
>
> Professor emeritus, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU <http://www.sussex.ac.uk/spru/>, University of
> Sussex;
>
> Guest Professor, Zhejiang Univ. <http://www.zju.edu.cn/english/>,
> Hangzhou; Visiting Professor, ISTIC
> <http://www.istic.ac.cn/Eng/brief_en.html>, Beijing;
>
> Visiting Fellow, Birkbeck <http://www.bbk.ac.uk/>, University of London;
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>
>
> ------ Original Message ------
> From: "Joseph Brenner" <joe.bren...@bluewin.ch>
> To: "Terrence W. DEACON" <dea...@berkeley.edu>; "Alex Hankey" <
> alexhan...@gmail.com>
> Cc: fis@listas.unizar.es; "Emanuel Diamant" <emanl....@gmail.com>;
> "Sungchul Ji" <s...@pharmacy.rutgers.edu>
> Sent: 1/14/2018 8:36:10 AM
> Subject: Re: [Fis] Response to Sungchul. Generative Logic
>
> Dear All again,
>
>
>
> Terry has introduced an absolutely essential concept on which we need to
> focus, that of a generative logic of informational relationships. I would
> just like to point out that we are not starting from zero. Some of us, for
> example Mark J. and I have already recognized the need for a new logic, in
> which understanding the dynamic relationships is central. In Logic in
> Reality, for example, Terry’s suggestion of the need to avoid “the
> tendency to use language-like communication as the paradigm exemplar” is
> already achieved by focus on the non-linguistic dynamic process properties
> of information.
>
>
>
> If Terry could expand *his* concept of the contours of a ‘generative
> logic’, it might be possible to show this even more clearly.
>
>
>
> Thank you and best wishes,
>
>
>
> Joseph
>
>
> ------------------------------
>
> *From:* Fis [mailto:fis-boun...@listas.unizar.es] *On Behalf Of *Terrence
> W. DEACON
> *Sent:* samedi, 13 janvier 2018 19:33
> *To:* Alex Hankey
> *Cc:* fis@listas.unizar.es; Emanuel Diamant; Sungchul Ji
> *Subject:* Re: [Fis] I salute to Sungchul
>
>
>
> Hi all,
>
>
>
> I would be very encouraged if we are trying to develop beyond mere lists
> of different uses of the term 'information' TO structured taxonomies of
> distinct types of information TO a generative logic of how these distinct
> modes of a complex information relationship are interrelated.
>
>
>
> Dualistically distinguishing intrinsic properties of an informing medium
> from relational properties that determine its reference provides an
> important first step in growing the concept to encompass its full
> usefulness. But I hope that we will also eventually begin to attend to the
> functional value that the conveyed reference provides, since this too is
> often implicitly part of the various uses of the term 'information' in
> colloquial and even scientific use. This requires more careful parsing of
> the term "meaning" that is often invoked.
>
>
>
> For instance, one can receive information that is unambiguously "about"
> something but where that which it is about is already known and therefore
> is "functionally redundant" (not to be confused with signal redundancy). Or
> this information can be about something that is irrelevant to a given
> function or end, while still being information about something.
>
>
>
> An example would be telling me the time when I already know what time it
> is. The statement about the time does indeed "mean" something—i.e. it is
> not meaningless, as gibberish would be. Similarly, if I ask to know the
> current temperature and I am instead told the time, the reference provided
> would be useless to me—i.e. it wouldn't "make a difference" in the
> colloquial English sense of that phrase. The concept of "meaning" tends to
> collapse or conflate these two distinctions—reference and
> significance—which I think we should endeavor to distinguish.
>
>
>
> In this respect I like the suggestion by Alex Hankey that we consider an
> example like the barely conscious "feeling" of being watched which both
> conveys information about an extrinsic state of affairs and additionally
> has a functional relevance which is implicit in the discomfort it typically
> elicits. Both the aboutness and the significance are relational, not
> intrinsic properties of information. They are distinct relations
> because they are asymmetrically dependent on one another. Thus if I am
> entirely unaware of being watched, I am not discomforted by it.
>
>
>
> Note also the difference in these relational attributes: aboutness or
> reference is "in relation to" some state of affairs, whereas significance
> or value is "in relation to" some *telos* intrinsic to an interpreting
> agent or system.
>
>
>
> Exploring such nondiscursive examples can help us to escape the tendency
> to use language-like communication as the paradigm exemplar. The analysis
> of the information intrinsic to and conveyed by music might in this respect
> provide a useful platform for future discussion.
>
>
>
> Are there other critical distinctions that we additionally need to
> highlight?
>
>
>
> Happy New Year, Terry
>
>
>
> On Fri, Jan 12, 2018 at 9:24 PM, Alex Hankey <alexhan...@gmail.com> wrote:
>
> And what about the Kinds of Information that you cannot put in a data set?
>
> The information that makes you turn your head and meet the gaze of someone
> staring at you.
>
> No one could do that, which we humans and all animals do constantly,
>
> unless we had received such information at a subliminal level in the
> brain.
>
> We all have that capacity, it is vital for survival in the wild. All
> animals do it.
>
> The 'Sense of Being Stared At' is a common experience for most animals,
>
> how far down the tree of life no one yet knows.
>
>
>
> Whatever triggers it is definitely 'A Difference that Makes a Difference',
>
> so fits in your definition of 'Meaningful Information' - it has to!
>
> BUT IT CANNOT BE DIGITAL INFORMATION.
>
> Please Face Up to This Fact.
>
>
>
> All best wishes,
>
>
>
> Alex
>
>
>
>
>
> On 13 January 2018 at 07:30, Sungchul Ji <s...@pharmacy.rutgers.edu> wrote:
>
> Hi Emanuel and FISers,
>
>
>
> Thank you, Emanuel, for your generous remarks.  It is heartening to know
> that our ideas converge, although we carried out our research independently
> of each other, a clear example of consilience.
>
>
>
> (*1*)  I like and agree with the Kolmogorov quote you cited in [1]:
>
>
>
> "*Information is a linguistic description of structures in a given data
> set.*"
>
>
>
>
> It seems to me that there are 4 key concepts embedded in the above quote,
> which we may view as the definition of what may be called the "Kolmogorov
> information" or the "Kolmogorov-Bateson information" for the
> convenience of reference:
>
>
> *i*)   data set (e.g., ACAGTCAACGGTCCAA)
>
> *ii*)  linguistic description (e.g., Threonine, Valine, Asparagine,
> Glycine)
>
> *iii*) structure (e.g., 16 mononucleotides, 8 dinucleotides, 5
> trinucleotides plus 1)
>
> *iv*) mathematical description (e.g., tensor product of two 2x2 matrices
> of 4 nucleotides) [2, 3].
>
>
>
> The first three elements are obvious; the 4th is less so, but is
> justified in view of the recent work of Petoukhov [2, 3].
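The counts in (iii) can be reproduced by chunking the example data set into non-overlapping n-plets; a minimal sketch of the counting (my illustration, not Sungchul's code):

```python
# The 16-letter example data set from (i).
seq = "ACAGTCAACGGTCCAA"

def nplets(s, n):
    """Split s into non-overlapping chunks of length n (the last chunk
    may be shorter when n does not divide len(s))."""
    return [s[i:i + n] for i in range(0, len(s), n)]

assert len(nplets(seq, 1)) == 16   # 16 mononucleotides
assert len(nplets(seq, 2)) == 8    # 8 dinucleotides
assert len(nplets(seq, 3)) == 6    # 5 trinucleotides plus the leftover "A"
assert nplets(seq, 3)[-1] == "A"
```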
>
>
>
> (*2*) Based on these ideas, I have constructed *Table 1* below of the
> various names applied to the two kinds of information which I described as
> I(-) and I(+) in my previous post.
>
>
>
>
>
>
>
> *Table 1.  *The *arbitrariness* of the signs referring to ‘information’.
> It doesn’t matter what you call it, as long as your chosen label refers to
> the right reality, thing, process, mechanisms, etc.
>
> 1  Type I Information                  Type II Information
> 2  Physical information                Semantic information
> 3  Shannon information                 Kolmogorov information, or
>                                        Kolmogorov-Bateson information
> 4  ‘Meaningless’ information           ‘Meaningful’ information
> 5  I(-) information, or simply I(-)    I(+) information, or simply I(+)
> 6  Quantitative information            Qualitative information
> 7  Mathematical information            Linguistic information (see Statement (1))
> 8  Formal information                  Phenomenological information
> 9  Interpretant-less sign [4]          Triadic sign [4]
>
>
> (*3*)  One practical application of the *dual theory of information* under
> discussion is in deducing the structure of cell language, or the
> structure of the linguistics of DNA, in a much more rigorous manner than
> was possible in 1997 [5].
>
>    It is the common practice in biology to use the terms "letters",
> "words", "sentences", and "texts" without any rigorous definitions.  The
> general rule is to follow the rules of concatenations used in linguistics
> literally and say that
>
>
>
> *i*) just as 26 letters in the English alphabet are combined to form
> words (the process being called the second articulation [5]), so the 4
> letters of the genetic alphabets, A, C, G and T/U,  combine in triplets to
> form genetic codons.  Similarly, just as words form sentences and sentences
> form texts by the same concatenation procedure (or tensor multiplication,
> mathematically speaking , i.e, linearly arranging words and sentences,
> respectively (see the second column in Table 2), so the 64
> nucleotide triplets combine to form proteins and proteins combine to form
> metabolic pathways by continuing the concatenation process, or the tensor
> multiplication of matrices of larger and larger sizes (see the
> fourth column, which is based on the physical theory of information, i.e.,
> without any involvement of* semantics* or the first articulation).
>
> *ii*)   In contrast to the fourth column just described, we can justify
> an alternative structural assignments based on the semantic theory of
> information as shown in the fifth column of *Table 2*.  Here the letters
> of the cell language alphabet are not always mononucloetoides but thought
> to be n-nucleotides, such as dinucleotides (when n = 2), trinucleotides
> (when n =3), tetranucleotides (when n = 4), penta-nucelotides (when n = 5),
> etc.  That is, unlike in human language where the letters of an alphabet
> usually consist of one symbol, e.g., A, B, C, D, E, . . . , *I am
> claiming that in cell language, the letters can be mononucleotides
> (i.e., A, G, C, T/U), dinucleotides (i.e., AG, AC, . . .),
> trinucleotides (i.e., ACT, GTA, . . .), tetranucleotides (i.e., ACTG,
> CCGT, . . .), pentanucleotides (i.e., ACCTG, TCGAT, . . .), and up to
> n-nucleotides (also called n-plets [2, 3]), where n is an unknown number
> whose upper limit is not yet known (at least to me).*  If this conjecture
> turns out to be true, then the size of the cell language alphabet can be
> much larger (10^3 - 10^9 ?) than the size of a typical human linguistic
> alphabet, which is usually less than 10^2, probably due to the limitation
> of the memory capacity of the human brain.
>
> (*iii*) From linguistics, we learn that there are at least 4 levels of
> organization, each level characterized by a unique function (see the second
> column).  Without presenting any detailed argument, I just wish to suggest
> that the linguistic structures deduced based on the semantic information
> theory (i.e., the fifth column) agree with the human linguistic structures
> (i.e., the second column) better than do the linguistic structures based
> on the physical/mathematical/quantitative information theory (i.e., the
> fourth column), when the functional hierarchy given in the third column is
> taken into account.
>
>
>
>
>
> *Table 2.  *Two versions of the linguistics of DNA based on (i) the
> physical information theory, and (ii) the semantic information theory [1].
> M stands for a 2x2 matrix whose elements are the 4 genetic nucleotides, A,
> C, G and T/U, i.e., M = [C A; T G] (see Figure 16 in [2]). The symbol, (x),
> indicates tensor multiplication [2, 3].  The I to II transition is known in
> linguistics as the second articulation; the II to III transition as the first
> articulation [4]; the III to IV transition was referred to as the third
> articulation [5].
>
> Level  Human language          Cell language:                  Cell language:
>        structure (function)    structure per PIT [1]           structure per SIT [1]
> -------------------------------------------------------------------------------------
> I      Letters (basic          4 nucleotides (A, C, G,         mono-, di-, trinucleotides,
>        building blocks or      T/U); M = [C A; T G]*           4-plets, 5-plets, . . . ,
>        basic physical signals)                                 n-plets of nucleotides
>
> II     Words (to denote)       16 dinucleotides;               any combinations of the
>                                M(x)M or M^2                    n-plets/genes/proteins
>
> III    Sentences (to decide)   64 trinucleotides/amino         assemblies of
>                                acids; M(x)M(x)M or M^3         genes/proteins, or
>                                                                metabolic pathways (MP)
>
> IV     Texts (to argue/        256 tetranucleotides;           networks of MPs
>        compute/reason, e.g.,   M(x)M(x)M(x)M or M^4;
>        syllogism)              metabolic pathways (?)
>
>
>
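The 4 -> 16 -> 64 -> 256 progression in the PIT column is just the entry count of successive tensor powers of M; a minimal sketch, with symbol concatenation standing in for the entries of Petoukhov's matrix construction (my simplification, not his code):

```python
from itertools import product

# Entries of M = [C A; T G], flattened.
letters = ["C", "A", "T", "G"]

def tensor_power_entries(n):
    """All length-n concatenations of the 4 letters: the entries of
    M(x)M(x)...(x)M with n factors, listed without matrix layout."""
    return ["".join(p) for p in product(letters, repeat=n)]

assert len(tensor_power_entries(1)) == 4     # nucleotides
assert len(tensor_power_entries(2)) == 16    # dinucleotides (M^2)
assert len(tensor_power_entries(3)) == 64    # trinucleotides/codons (M^3)
assert len(tensor_power_entries(4)) == 256   # tetranucleotides (M^4)
```

Each tensor multiplication multiplies the alphabet size by 4, which is why level IV has 4^4 = 256 tetranucleotides.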
>
> In other words, the structure of cell language deduced based on the
> semantic information theory agrees better, functionally, with that of the
> human language than the structure of cell language deduced based on the
> physical information theory, thus further supporting the 1997 postulate
> that cell and human languages are isomorphic [5, 6].
>
>
>
> If you have any questions or suggestions for improvements on the above
> tables, I would appreciate hearing from you.
>
>
>
> All the best.
>
>
>
> Sung
>
>
>
> References:
>
>    [1] Emanuel Diamant, *The brain is processing information, not data.
> Does anybody care?, *ISIS Summit Vienna 2015, Extended Abstract.
> http://sciforum.net/conference/isis-summit-vienna-2015/paper/2842
>
>   [2] Petoukhov, S. (2017).  Genetic coding and united-hypercomplex
> systems in the models of algebraic biology. *BioSystems* *158*: 31-46.
>
>
>   [3] Petoukhov, S. (2016).  The system-resonance approach in modeling
> genetic structures. *BioSystems* *139*:1-11.
>
>    [4] Ji, S. (2017). *Neo-Semiotics*: Introducing Zeroness into Peircean
> Semiotics May Bridge the Knowable and the Unknowable. *Prog. Biophys.
> Mol. Biol.* *131*:387-401. PDF at
> http://www.sciencedirect.com/science/article/pii/S0079610717300858?via%3Dihub
>    [5] Ji, S. (1997). Isomorphism between cell and human languages:
> molecular biological, bioinformatic and linguistic implications.
> *BioSystems* *44*:17-39.  PDF at
> http://www.conformon.net/wp-content/uploads/2012/05/Isomorphism1.pdf
>
>     [6] Ji, S. (2017).  The Cell Language Theory: Connecting Mind and
> Matter.  World Scientific, New Jersey.  Chapter 5.
>
>
> ------------------------------
>
> *From:* Fis <fis-boun...@listas.unizar.es> on behalf of Emanuel Diamant <
> emanl....@gmail.com>
> *Sent:* Friday, January 12, 2018 11:20 AM
> *To:* fis@listas.unizar.es
> *Subject:* [Fis] I salute to Sungchul
>
>
>
> Dear FISers,
>
>
>
> I would like to express my pleasure with the current state of our
> discourse – an evident attempt to reach a more common understanding about
> information issues and to enrich previously given assessments.
>
> In this regard, I would like to add my comment to Sungchul’s post of
> January 12, 2018.
>
>
>
> Sungchul proposes “to recognize two distinct types of information which,
> for the lack of better terms, may be referred to as the "meaningless
> information" or I(-)  and "meaningful information" or I(+)”.
>
> That is exactly what I have been trying to put forward for years, albeit
> under more historically rooted names: Physical and Semantic information [1].
> Never mind: what is crucially important here is that the duality of
> information becomes publicly recognized and accepted by the FIS community.
>
>
>
> I salute to Sungchul’s suggestion!
>
>
>
> Best regards, Emanuel.
>
>
>
> [1] Emanuel Diamant, *The brain is processing information, not data. Does
> anybody care?, *ISIS Summit Vienna 2015, Extended Abstract.
> http://sciforum.net/conference/isis-summit-vienna-2015/paper/2842
>
>
>
>
>
>
>
>
>
> _______________________________________________
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>
>
>
>
> --
>
> Alex Hankey M.A. (Cantab.) PhD (M.I.T.)
> Distinguished Professor of Yoga and Physical Science,
> SVYASA, Eknath Bhavan, 19 Gavipuram Circle
> Bangalore 560019, Karnataka, India
> Mobile (Intn'l): +44 7710 534195
>
> Mobile (India): +91 900 800 8789
>
> ____________________________________________________________
>
>
>
> 2015 JPBMB Special Issue on Integral Biomathics: Life Sciences,
> Mathematics and Phenomenological Philosophy
> <http://www.sciencedirect.com/science/journal/00796107/119/3>
>
>
>
>
>
>
>
> --
>
> Professor Terrence W. Deacon
> University of California, Berkeley
>
>
>
>
>
>


-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com
