Science is naturally conservative, and scepticism towards new ideas helps ensure high scientific quality. We have more confidence when different scholars arrive at the same conclusion independently of each other. But scientific research also brings about discoveries and innovations, and it typically takes time for such new understanding to receive acknowledgement and acceptance. In the meantime, it is uncertain whether new ideas really represent progress or are misconceived. Sometimes we can shed more light on them through scientific discussion.
I recently experienced the contrast between old knowledge and new ideas when my research group used a well-established mathematical concept in a new way. In this case, the mathematical concept is related to so-called eigenfunctions, known in climate research as empirical orthogonal functions (EOFs) and discussed earlier in a post here (‘Why not use a clever mathematical trick?’). I decided to elaborate on this idea through a discussion paper in EGUsphere with an open review. It will hopefully lead to some scientific discussion that may tell me whether our new ideas represent progress in terms of evaluating climate models.
A motivation for this discussion paper was my surprise at how rarely EOFs are applied to joint datasets, a technique known as ‘common EOFs’. For instance, common EOFs are absent from the community-based ESMValTool (Eyring et al., 2020) despite their merits, something we discuss in our recent discussion paper (Benestad et al., 2023). Furthermore, text searches for “common EOFs”, “common principal component” and “common empirical orthogonal functions” through the full Working Group 1 report of the IPCC sixth assessment report (2409 pages) gave only one hit:
Predictor patterns that are common to observations and climate model data can be defined by common empirical orthogonal functions (Benestad, 2011).
On Google Scholar, the same searches gave 116, 1680, and 64 hits respectively; in this case, the higher number of hits for “common principal component” reflects many more disciplines than climate research.
So why bother with common EOFs? Bernhard Flury wrote a book in 1986 with the title ‘Common Principal Components and Related Multivariate Models’ (Flury, 1986), and a few other studies picked up the idea, such as a report from 1993 and Sengupta and Boyle (1998). I think common EOFs are a clever concept and have used them as a framework for empirical-statistical downscaling since 2001 (Benestad, 2001). In 2017, a group of colleagues and I wrote a perspective on the use of statistical methods in climate research where we also discussed common EOFs (Benestad et al., 2017).
In the latest discussion paper, we highlighted several ways of using them to deal with large multi-model ensembles and huge data volumes, popularly known as “Big Data”. They make use of the redundancy of information embedded within the data and boil it down to the most salient aspects. Common EOFs also make it possible to take advantage of some attractive mathematical properties, such as orthogonality. This is somewhat similar to Fourier series, which likewise provide orthogonal components and, in addition, make it easy to estimate derivatives and, for example, the length of solar cycles (Benestad, 2005).
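The basic idea can be sketched with a toy example: two datasets on the same spatial grid are stacked along the time axis, and the EOFs of the joint matrix give spatial patterns shared by both. The data sizes and variable names below are hypothetical, and this is only a minimal illustration of the concept, not the method of the discussion paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: two datasets (e.g. observations and one model)
# on the same spatial grid, stacked along the time axis.
n_time_obs, n_time_mod, n_space = 120, 200, 50
obs = rng.standard_normal((n_time_obs, n_space))
mod = rng.standard_normal((n_time_mod, n_space))

# Remove the mean state of each dataset before combining, so the EOFs
# describe anomalies rather than differences in mean state.
joint = np.vstack([obs - obs.mean(axis=0), mod - mod.mean(axis=0)])

# SVD of the joint anomaly matrix: the rows of Vt are the common EOFs
# (spatial patterns shared by both datasets); U*s gives the PCs.
U, s, Vt = np.linalg.svd(joint, full_matrices=False)
common_eofs = Vt            # (n_modes, n_space)
pcs = U * s                 # (n_time_obs + n_time_mod, n_modes)

# The PCs split back into one series per dataset, so the same spatial
# pattern can be compared between observations and the model.
pcs_obs, pcs_mod = pcs[:n_time_obs], pcs[n_time_obs:]

# Orthogonality of the spatial patterns (one of the attractive properties).
assert np.allclose(common_eofs @ common_eofs.T, np.eye(len(s)), atol=1e-8)
```

The key point is that both datasets are projected onto one shared set of spatial patterns, so their principal components can be compared mode by mode.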
So far, I haven’t seen any arguments against the use of common EOFs. So why have common EOFs not been used more?
References
- V. Eyring, L. Bock, A. Lauer, M. Righi, M. Schlund, B. Andela, E. Arnone, O. Bellprat, B. Brötz, L. Caron, N. Carvalhais, I. Cionni, N. Cortesi, B. Crezee, E.L. Davin, P. Davini, K. Debeire, L. de Mora, C. Deser, D. Docquier, P. Earnshaw, C. Ehbrecht, B.K. Gier, N. Gonzalez-Reviriego, P. Goodman, S. Hagemann, S. Hardiman, B. Hassler, A. Hunter, C. Kadow, S. Kindermann, S. Koirala, N. Koldunov, Q. Lejeune, V. Lembo, T. Lovato, V. Lucarini, F. Massonnet, B. Müller, A. Pandde, N. Pérez-Zanón, A. Phillips, V. Predoi, J. Russell, A. Sellar, F. Serva, T. Stacke, R. Swaminathan, V. Torralba, J. Vegas-Regidor, J. von Hardenberg, K. Weigel, and K. Zimmermann, "Earth System Model Evaluation Tool (ESMValTool) v2.0 – an extended set of large-scale diagnostics for quasi-operational and comprehensive evaluation of Earth system models in CMIP", Geoscientific Model Development, vol. 13, pp. 3383-3438, 2020. http://dx.doi.org/10.5194/gmd-13-3383-2020
- R.E. Benestad, A. Mezghani, J. Lutz, A. Dobler, K.M. Parding, and O.A. Landgren, "Various ways of using Empirical Orthogonal Functions for Climate Model evaluation", 2023. http://dx.doi.org/10.5194/egusphere-2022-1385
- R.E. Benestad, "A comparison between two empirical downscaling strategies", International Journal of Climatology, vol. 21, pp. 1645-1668, 2001. http://dx.doi.org/10.1002/joc.703
- R. Benestad, J. Sillmann, T.L. Thorarinsdottir, P. Guttorp, M.D.S. Mesquita, M.R. Tye, P. Uotila, C.F. Maule, P. Thejll, M. Drews, and K.M. Parding, "New vigour involving statisticians to overcome ensemble fatigue", Nature Climate Change, vol. 7, pp. 697-703, 2017. http://dx.doi.org/10.1038/NCLIMATE3393
- R.E. Benestad, "A review of the solar cycle length estimates", Geophysical Research Letters, vol. 32, 2005. http://dx.doi.org/10.1029/2005GL023621
Russ Doty says
After all the pushback you got over your use of the word “trick,” one would think you would adopt a more descriptive word that did not open climate science up to criticism from deniers.
Susan Anderson says
I think we need to stop letting cheap shots by “opposition research” define us. I hope, but doubt, that this will display; if not, the general idea is that the word “trick” has been exploited to deceive.
Susan Anderson says
One of these links will find the actual cartoon; the second is the image itself, which might just appear here. I’m afraid Marc Roberts has disappeared (any info on his whereabouts would be welcome):
http://throbgoblins.blogspot.com/2009/ – here it is ->
http://3.bp.blogspot.com/_2fgn3xZDtkI/SziXnfjHy1I/AAAAAAAACxE/cjWviqrQGzM/s400/TrickSTRIP(MINI).jpg
Rory Allen says
Perhaps instead of ‘trick’ one could use the word ‘tool’. That way, the word could be used twice over, once for the mathematics, and once to describe those in “opposition research” who indulge in cheap shots.
RodB says
Very thoughtful and interesting. Maybe something will come of it. Maybe it is not used much in climate science because it is mathematics that is not widely understood, applied to a science model that is not fully understood.
Paul Pukite (@whut) says
Based on how other scientific and engineering disciplines apply the ideas, eigenfunctions and EOFs are likely overrated in their importance. An eigenfunction (or eigenvector) is a root of a differential equation (or of a multi-dimensional DiffEq). This means that it’s the natural response to an impulse function, yet that rarely arises in climate behaviour. Think about all the natural variations we see in climate (the daily temperature cycle, the annual or seasonal cycle): these are not eigenfunctions but are strictly the forced response to an external cycle, with only the shape (i.e. decay, damping, etc.) of the response governed by the DiffEqs. Another aspect of eigenfunctions that limits their practicality is that there may be an infinite number of solutions, especially in the spatio-temporal realm. Often what constrains the number of solutions is the boundary conditions, so that the standing-wave modes are what limit (and potentially amplify) the response, as a consequence of constructive and destructive interference of travelling waves. And again, this may be only incidentally related to an eigenfunction, as the size of the bounding volume or waveguide defines the spatial wavenumber.
An example of where these concepts come together is tidal analysis. Ocean tides are the fluid-dynamics response to lunar and solar forcing, first formulated by Laplace in 1776. Now, one could solve Laplace’s tidal equations (LTE) for all the eigenfunctions (see Hough functions), but no one does that in practice. Instead, what tidal analysis does is collect the lunar and solar forcing cycles and harmonics and fit the amplitudes to a measured tide-gauge time series. That becomes the forced response and is used to do tidal prediction. There may be cases where the bounding volume is such that the tidal cycle constructively interferes with itself and tides are amplified, as in the Bay of Fundy. However, that is not a resonant cycle that differs from an expected lunar or solar tidal constituent period, just as an auditorium doesn’t change what an orchestra playing Bach fundamentally sounds like. Some resonance frequencies may arise, but that’s not the predominant response.
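The fitting step described here (known astronomical constituent periods, with only amplitude and phase estimated from the gauge record) can be sketched as an ordinary least-squares harmonic fit. The synthetic "gauge" series and the small set of constituents below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# A few well-known tidal constituent periods in hours (M2, S2, O1, K1).
periods_h = np.array([12.4206012, 12.0, 25.81933871, 23.93447213])

# Synthetic one-year hourly "gauge" record: two constituents plus noise.
t = np.arange(0.0, 24 * 365, 1.0)
true = 1.2 * np.cos(2 * np.pi * t / periods_h[0] - 0.4) \
     + 0.5 * np.cos(2 * np.pi * t / periods_h[1] - 1.1)
gauge = true + 0.1 * rng.standard_normal(t.size)

# Design matrix of cosine and sine terms for every constituent; the
# least-squares solution gives each constituent's in-phase and
# quadrature parts, i.e. its fitted amplitude and phase.
omega = 2 * np.pi / periods_h
X = np.hstack([np.cos(np.outer(t, omega)), np.sin(np.outer(t, omega))])
coef, *_ = np.linalg.lstsq(X, gauge, rcond=None)

a, b = coef[:4], coef[4:]
amplitude = np.hypot(a, b)   # fitted amplitude per constituent
# amplitude[0] recovers roughly 1.2 (M2) and amplitude[1] roughly 0.5 (S2);
# the O1 and K1 amplitudes come out near zero, as they carry no signal here.
```

The forcing frequencies are fixed inputs; only their amplitudes and phases are estimated, which is what makes the fitted model usable for prediction.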
I personally have never used EOFs when doing my own ENSO analysis. It’s very straightforward to calibrate the tidal forcing to length-of-day (LOD) measurements and create a transfer function that maps the input forcing to an output time series such as NINO4 — like with tidal analysis, a k-fold cross-validation is all that is required to demonstrate the parsimony of the approach. What pops out are the nonlinear amplifying factors corresponding to the standing wave modes of the ENSO dipole.
The key to this analysis is that even though the orchestra may be playing Bach, once the LTE nonlinear solution is applied, it no longer sounds like Bach. That’s likely why no one has been able to map out ENSO predictively, as we easily can with tides: the nonlinear response scrambles the input so that it only vaguely resembles the output.
As Rasmus said above, “Science is naturally conservative and the scepticism to new ideas …. typically takes time for such new understanding to receive acknowledgement and acceptance.” Something new is indeed required, and although this RC post is on the right track, it’s on the wrong railroad. The common EOFs required may be a common-mode tidal forcing. Please look into that; I’ve published several times since 2016. Thanks for at least pushing for something new, the only way any progress will be made.
rasmus says
[Thanks for your comments, Paul. I’m not sure how to interpret them, though, as my point is that common EOFs actually work very well. -Rasmus]
jgnfld says
In some areas of statistics these techniques are used to calculate the “canonical correlation” from the eigenvalues, and the eigenvectors then define the weights each variable contributes to the common variance.
Calculating a canonical correlation is fairly easy. _Interpreting_ the factors sensibly is a far harder job. But sometimes it can be done.
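As a rough sketch of what the comment describes, canonical correlations can be computed as the singular values of the whitened cross-covariance matrix of two datasets, with the transformed singular vectors giving the variable weights. The toy data and variable names below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two multivariate datasets sharing one hidden signal z in their
# first columns; CCA should find the linear combinations of each
# dataset whose correlation is maximal.
n, p, q = 500, 4, 3
z = rng.standard_normal(n)
X = rng.standard_normal((n, p)); X[:, 0] += 2 * z
Y = rng.standard_normal((n, q)); Y[:, 0] += 2 * z

Xc, Yc = X - X.mean(0), Y - Y.mean(0)

def inv_sqrt(C):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# Sample covariance blocks, then SVD of the whitened cross-covariance.
Cxx, Cyy = Xc.T @ Xc / n, Yc.T @ Yc / n
Cxy = Xc.T @ Yc / n
U, s, Vt = np.linalg.svd(inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy),
                         full_matrices=False)

canonical_corrs = s               # singular values = canonical correlations
weights_x = inv_sqrt(Cxx) @ U     # weights of each X variable per mode
weights_y = inv_sqrt(Cyy) @ Vt.T  # weights of each Y variable per mode
```

Because both datasets share the signal z, the leading canonical correlation comes out large, while the weight vectors show which variables carry that shared variance, which is the interpretation step the comment warns is the hard part.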
Paul Pukite (@whut) says
That’s a good point. Often these “eigenfunctions” or EOFs are not true roots but heuristics that seem to work … until they don’t. That’s essentially the definition of a heuristic: something that does the job of fitting some data without a basis in physics. If it ceases to do the job, it’s time to move on to a new heuristic, as there was no physical basis in the prior one.
Carbomontanus says
Hr Benestad
I find it quite difficult to follow you here, having been in that situation of established ground versus new ideas several times, with pioneering work where the answer key (FACIT) is not yet written.
And I have to agree more and more with Max Planck, who said that “Die Wissenschaft schreitet mit den Bestattungen fort!” (“Science advances funeral by funeral.”)
Which is so, not because science is conservative,
but because blunt, pedantic, unqualified, and rigid “forkalkede” (calcified) old Piggy Professors from the Party (oPPP) have sneaked into the chairs and positions year by year and taken over the very shop,
to their own advantage and rents, in their own anti-scientific style.
This is not because science is “conservative”, but because those obviously unqualified conservatives sneak in and take over the very periodicals and institutes.
=======000
Further, on eigenfunctions and orthogonal behaviours.
I have made my own set of concepts there. Your thoughts, your methods, and your coordinate systems should be PHAENOMENOLOGICALLY CONGRUENT, and you should look better for ADEQUATE PARAMETERS.
Quite often, new tools and methods have to be invented because they are not invented yet, and not mass-produced for sale in the shops. Old Peculiar Professors (oPP) fear that perhaps most of all, because they are sons of the classical manufacturing industries, where everything is proven and understood for the workers and handed out from upstairs.
On phaenomenological congruence: “Good tools do good work,” the shoemaker remarked; he ate the soup with his awl! Moral: for broad soups, rather take broad, phaenomenologically congruent spoons.
And do not measure and map the electromagnetic system minutely with calipers, in tenths of millimetres, on all its irrelevant surfaces. Measure it in ohms, amperes, volts, and seconds, and judge rather the ampere-windings over a laminated, closed circuit of iron. Further, take the historically correct tools: Edison’s incandescent test lamp and the old moving-coil galvanometer with a pointer and a tiny, air-damped spiral spring, on the electromotor. Because that famous measuring device called “a multimeter” is phaenomenologically congruent to what you are examining, and quite contemporary and historically correct.
If you do that, then the sun rises over the given electromagnetism, and it becomes quite especially easy and comprehensible.
Otherwise you see but a worm’s nest inside the electromagnetic dynamo machine, and ask for your statistics and confidence as you were taught, and feel very adult, professional, sceptical, and scientific that way.
Then for orthogonality.
When did the Universe become orthogonal?
When did Matter, Reality, Life, Politics,… the Climate….. become orthogonal?
(Because you ought to have ideas, thoughts, and opinions that are congruent to what you are examining, measuring, and discussing.)
In my experience and knowledge, things, realities, and horizons may well be unique and simply straightforward. But they may also be double, and they are quite often trigonal, if not pentagonal, hexagonal, even heptagonal in some rare cases.
Whereas orthogonal mappings, concepts, and thoughts are clearly artificial and imperialistic, and less applicable in most cases.
Orthogonal thought and performance is rather due to the squareheads, and is hardly natural, scientific, or applicable without violence to data and to reality.
rasmus says
[Yes, I think I lost you. My point is that common EOFs work, as you can see from the discussion paper and its supporting material. -rasmus]
Carbomontanus says
Hr Benestad
Thanks for your tolerant reply.
I have looked it up and am beginning to see what “EOF” perhaps might be.
It is very different from the methods I might use, especially because I haven’t got the necessary computer methods, and I have trained myself to manage in other ways, with methods that I have access to.
But also because it is hardly in my faculty’s traditions, where digitalization is hardly mainstream or conventional.
But I can well imagine that it is necessary for modern regional and global weather forecasts, for instance, where there is an obvious mixture of chaotic and rather coherent, laminar processes. That is not new to me at all, because I have been sailing a traditional open 17-foot boat across and between the breakers.
I discuss it in terms of CHAOS and COSMOS; both are natural and real together.
Unpredictability is also a natural reality.
I have developed oscilloscopic and oscillographic methods for musical acoustics and musical-instrument design, that is, for complex and highly refined pneumatic oscillators of high accuracy. There too, you “sail” dynamically at the edge of where the tone breaks, where the laminar, complex sound figure splashes and collapses. We find it again in finely tuned radio receivers and transmitters, and in the Chladni experiment, when we drive it too hard on tricky complex tones and the coherent laminar wave figure runs into conflict with its end conditions.
I found Fourier analysis mentioned as similar to “EOF”, and there I can tell quite a lot more.
Fourier had Pythagoreanism on the brain and hardly wrote about anything else. We find it again in Helmholtz’s phonetics, which has become compulsory for general linguistics and discusses spoken vowels in minute detail. “The rain in Spain falls gently in the plain…!” But that is hardly Italian opera, where vowels (sound-colour figures) are much better developed.
I am using the RC-coupled oscilloscope, where a pure sinus becomes a ring, and normal musical tones anything but a ring: a complex system of stable, coherent extra loops on the pure ring.
There one also finds the CHAOS: sheer ugly noise, and partial beats called rumble, burble, and Wolf im Ton (wolf tone).
All of that is essential to weather forecasting, but in fine music, rather avoid it. As in fine turbines and aeronautics: make it as regular, laminar, and noiseless as possible; avoid turbulence and fine-performance disaster.
As for Fourier analysis, I have put together resonant receivers with tuneable frequency and positive feedback, which is one of the most sensitive detection methods of all. By that, one can pick out and map the waveforms in space of singular partial frequencies in a laminar, whole-numbered, coherent, harmonic “tone”.
But then, with absolute chaos (“white noise”, thermal noise), there are hardly any patterns or partials or special monochromatic frequency signals at all. The analytic receiver device will only receive and transmit further its own minutely tuned frequencies over a continuous scale.
For that kind of signal, Fourier’s theory loses its applicability. The fantastically sharp, sensitive method only shows itself and nothing else, regardless of input.
It goes into “spin”, we say.
(Like Ellestad, Humlum, and Solheim on climate-cycle histories.)
We can find further examples and patterns where Fourier analysis is rather a less fruitful method for finding “coherence” and macro-physical patterns.
Take a picture of Oslo city, for instance, and draw it up minutely as a linear function all over. Fourier will just be a dilettantish explanation and theory of that; Fourier analysis hardly applies to any given function and curve, as is often claimed. Rather, go and get a map from the tourist office; that will be a much better scientific model theory of Oslo.
All the way, the measuring and analytical methods must be PHAENOMENOLOGICALLY CONGRUENT to what you are trying to find out. Because you will hardly find the city hall, the royal palace, the parliament, and the central station by Fourier analysis of Oslo town.
This is also a timeless discussion of epistemology, on which experts and philosophers at any time cannot agree.
Pattern, coherence, or consequence recognition seems only possible if ideas, CONCEPTS, are pre-existing: given in advance as potential concepts that can be activated, made aware of, or at least re-minded of.
It obviously sits deep in our evolutionary genetic heritage, since this holds for animals too. Mind and consciousness seem to be a vital and natural, potential reality that should be considered first.
The total lack of concepts and ideas makes one totally helpless and unconscious. That seems to hold for animals already.
And people do not agree on this,
but:
Til visse, Peer Gynt, i anelsens Mangel
Har Manden med Hoven sit bedste Angel!
(“To be sure, Peer Gynt: in the absence of inkling, the Man with the Hoof has his best fishing hook!”)
TRUE!
jgnfld says
In various statistical dimensionality-reduction techniques, orthogonality is a logical and helpful convenience, but it is not a necessity by any stretch of the imagination. Nonorthogonality is straightforward to include; see the various oblique rotation techniques, which are long-established procedures from well before the computer era. (The oldsters here will remember the old research-assistantship job of running sums of squares and cross-products on big 10×10 or even 12×12 mechanical calculators four hours a day, which barely paid for beer money.)
Given the difficulties some here have had dealing with multiple orthogonal causes acting to produce the results, I have strong doubts these posters can handle nonorthogonal multiple causality.
Carbomontanus says
jgnfld
Multiple causality and multivariable systems are not caused by the methods with which we pick into them. Rather, on the contrary, they may be disturbed, contaminated, and destroyed, ruined, by such operational behaviours, however adult, learnt, advanced, and professional.
Sir Arthur Eddington, the man who understood the sun, published on this in two famous books, “The Philosophy of Physical Science” and “The Nature of the Physical World”.
Eddington’s contribution is said to be equivalent to Kant’s Kritik der reinen Vernunft with regard to 20th-century atomism and “quantum physics”, Bohr’s and Heisenberg’s basic principles of uncertainty.
In that view, the human mind and the premises for understanding anything at all must be studied and understood as a scientific method and device that also disturbs, mixes into, and contaminates what is to be measured, studied, and understood. Man and the human mind, scientific reason and praxis, are an inevitable part of the whole and must be studied and understood together with it.
CREDO IN the desktop computer at hand, with its “statistics” and necessarily arbitrary digitalization, its orthogonality and sums of squares and square roots where there are no squares and roots in natural reality: that is such an artificial disturbance and contamination, maybe a major systematic error, to the given sample material, to given nature, climate, and given reality.
If there are elephants in the room and you fantasize in terms of squares and roots and orthogonality instead, maybe you are quite lost and will never find the truth.
Did God create man in his own fashion, or did man create God in his own fashion? That is an old and unresolved question, on which the philosophers cannot agree.
Psychical or mental projections, wishful thoughts, we call it.
Be aware of that..
Brian Mapes says
I always taught this as “combined EOFs”, if what is meant is maximum-covariance modes of different fields (variables). If the fields have different units, they must be standardized. Oh wait, no: this adjective “common” (a very inexpressive, common word) apparently means “two or more datasets combined on a common grid along the time axis”. Different datasets: does that mean different variables? Does “combined on” refer to the word “grid” (which sounds spatial), or to the “time axis” (and is that being called a “grid”)? All these inexpressive, ambiguous wordings forced me to read the paper just to find out what very uncommon (rare) tactic is being labeled as “common”. Even there, there is no diagram nor mathematical spell-out like EOF(x)*PC(t). The words “maximum covariance” (which is what all these analysis techniques are really aiming at) do not appear. Without understanding what a technique is maximizing, minimizing, or optimizing, the meaning seems buried in conventions rather than virtues or goals. So maybe wording and labeling are part of the reason this time-encoding trick, for pooling heterogeneous data series whose timelines are arbitrary or not synchronized, has not become popular enough to win the familiarity rights to a common word like “common”?
Carbomontanus says
B Mapes
I have the impression that this is what they have been doing for a while, “homogenization of meteorological time series and data”, so they are “made common” and can be integrated into larger-scale complex models and analyses.
So I feel sure that they are aware of it.
It is very critical also, because they are to judge and discuss fractions of a degree per decade, built from local data where temperatures may normally swing dozens of degrees daily, weekly, and annually, and taken with more or less numb and frozen fingers.
This is also an aspect that has been criticized toughly from the denialist and “sceptic” side: “How can you predict the climate in a century so surely when you cannot even tell for sure the weather for next week?”
To that come also suggestions of famous systematic errors, such as urban heat islands and a CO2 volcano next to the Mauna Loa CO2 station for the Keeling curve. And we have seen discussions of varying “conventional” procedures for measuring seawater temperatures on the high seas.
All this together is why I have been quite critical of that digitalized desktop-computer model statistics work with “confidence” here,
and have rather pointed at obvious needs for online scientific instrument measurement and experimental innovation and instrument design, for perhaps more “phaenomenologically congruent” data logging: “What is it really, after all, that we are dealing with? And why? For what purpose?”
Models, and statistics on models where data are already transformed and transduced, will hardly uncover much of that.
I always had to invent the way and the devices when doing such work where the answer key (FACIT) is not yet written, because the tools were not invented yet and not mass-produced for sale in the shops.
jgnfld says
Yes. As I and others have mentioned, running the equations is trivial in these days of computing. _Interpreting_ _the_ _results_ spit out by these techniques in a sensible and reproducible way is usually (in my experience, always) another thing entirely.
But when they do come out sensibly and reproducibly, they can be very interesting and powerful.
Paul Pukite (@whut) says
Common most often implies common-mode, as in an underlying mechanism that is common among different behaviours. I have a comment above where I describe how all tidal factors are common; only the amplitude and phase may change depending on the location.
Carbomontanus says
Yes, maybe.
But your example, that tidal factors are common although amplitude and phase (and the noise on top) may change depending on location, tells us that it is often quite important also to have an idea of what to search for, namely such an eventual common tidal factor.
That may be very well hidden indeed in all the complex signal mess.
I can especially remember a radio-listening situation on shortwave. In all the noise and mess I heard something. Could it be? Which language? Yes, really: a Danish fisherman. And Danish is not the easiest; on the contrary, even Danish children have their difficulties, so it takes them some more time to learn it than in other languages. But when I “got it”, then I could further hear all he said.
The linguistic and auditory mind “repairs” the holes in the signal by ideal understanding, if the meaning of it is “on the roller”. Then it comes in better.
That is so everywhere in sensory physiology and possible pattern recognition and signal understanding. The LOGOS, the meaning and idea of it, must be there first before you really get it. No robot has yet got that skill. It is the most basic property of meaning, mind, and consciousness.
You hardly see the faintest star either, before you know consciously that perhaps it is there.
And you suppress just as efficiently things that ought not to or should not be there.
Thinking seems not possible if not a bit wishful: “the will to understanding must also be there”.
If you somehow hate that lecture or that LECTOR, you hardly listen either, and you hardly get it.
Carbomontanus says
@ Benestad & al
Permit me to disturb you again
This question really interests me, because I have been up against it so many times: breaking with the orders and routines to get things done.
If the mainstream are all working on why it breaks and find no good answer, then turn the question around and ask why it holds.
Or, more orthodox: if you cannot prove it, then rather try to disprove it. Orthodoxy and the mainstream may have struggled all those years because they have stated the problem in an inadequate and clumsy way. The easy, elegant solution may then be painful to the orthodox mainstream indeed, and thus refused by peer reviewers. Especially by anonymous ones, who never stated and solved any problem outside their given factory and answer key (FACIT).
But we often seek solutions to problems where the answer key (FACIT) is not written yet.
I found a large book under the Christmas tree in high school: Bertrand Russell’s “Wisdom of the West”. It proved very practical in the disputes with the Sovietophiles. Their so-called “dialectic materialism” as a basic philosophical and scientific method was relentlessly ridiculed and ruined by Sir Bertrand.
But later I have come to think: Whitehead and Russell’s Principia Mathematica is believed to be the bible and catechism of 20th-century logic, so who is Alfred North Whitehead, who was the first author, with Russell only the second?
Russell notes that Whitehead’s philosophy is nebulous and may be similar to Leibniz’s monad theory, “but as he began denying Galilei’s teachings on primary and secondary sensual qualities, I can no more follow my earlier colleague!”
The anthroposophers also deny the same, to the disadvantage of their possible science, I know.
And the Pope, Pastor Wojtyla from Krakow, recently declared the process against Galilei erroneous, sinful, and deeply regrettable, on behalf of the church and all the saints, and in all languages!
(Pastor Wojtyla from Krakow was an unimprovable Pole and Copernican, you see.)
But maybe we lose something after all.
So I have checked up on Whitehead more closely. Whitehead did deny and disqualify a lot of things, to an extent that made him emigrate to the USA and teach further over there in the States.
Whitehead seems to have disqualified basic concepts from Descartes, Newton, and John Locke that make up the basic fundament of logical empiricism.
Instead he introduced something called “process philosophy”. A thing or an object hardly exists in itself, and does not represent basic or primary existence in any case. Things exist only in their relation to other things, and what really exists first of all is that process of interaction and relations.
It is hard, even impossible, for me to think and to work that way. I need to know that the proton and the electron exist first of all, and that there are further chemical or material ground elements independent of any processes and relations.
To check it up, I discussed it with my colleague R. Finsrud, the famous artist and perpetuum-mobile maker in Drøbak (a very skilled mechanician, artist, and magician), who said spontaneously that what exists first of all is the process and the relations of things.
Two days later, he regretted it and could agree that we need both the bricks and the co-existence, the glue and functional connections between them.
But I come to think, and this is my point this time, that as the philosophers seem not to agree on these things, maybe some of those here who apparently deliver surrealisms and unpopular relational absurdisms are of the Whitehead type of mentality, who never could follow and enjoy logical empiricism in its orthodox mainstream fashion.
They may just be quite less talented and trained at it, compared to Alfred North Whitehead.