ECOS 2026: Professor Enrico Sciubba, you are well known for your contributions to Applied Thermodynamics, or Engineering Thermodynamics. One example of the methods developed over the years is what came to be called Exergy Modeling, Analysis and Optimization, to which you and many other scientists have contributed, with outstanding results. In the last decade there has been growing attention towards Data-Driven approaches in science. What is your opinion about the relation between Data-Driven approaches and Model-Based approaches in Applied Thermodynamics?
E.S.:
Well, let’s begin with the “official” definition of Data Driven Approach to Design (DDAD): “a systematic method that uses data to guide every stage of the design process, replacing intuition or assumptions with evidence”. Let me provide an articulated answer:
- Historically, human designers have always used past experimental evidence to perfect their work or to concoct new processes and structures: this is why you can usually tell the difference between the work of a novice and that of a seasoned designer. DDAD has systematically enlarged the breadth of the available “experience”, and this is -oddly enough- more useful for expert designers, who can quickly sift through past solutions.
- The exponential increase in data storage/retrieval power has generated what is called information overload: the designer is confronted with so many data, often scarcely related to each other, that “more data” means “more confusion” and adds uncertainty to the decision.
- AI has fundamentally improved the situation: current tools can “intelligently” run through the data and identify correlations and connections, and even formulate quantitative preliminary suggestions. A distinction is necessary here: the so-called “machine learning techniques” are based on Artificial Neural Networks (ANNs) that are said to “mimic the human brain”. In practice, ANNs are 3-dimensional virtual structures consisting of sets of 2D matrices of properly interconnected switches (the “neurons”), in which each switch receives an input from one or more of its upstream companions and “fires” a modulated signal to one or more of the downstream ones (in some instances, feedback is used, and some nodes send “back-signals”). Thus, in spite of the astounding mathematical complexity of their operation, ANNs simply build an “answer” (the “solution”) somehow correlated with the “input” (the “problem”). To make sure that the answers are reasonably accurate, ANNs are trained by providing a certain input and tuning the internal connections until the output is what we expect it to be. The more extensive the training, the higher the probability that the answer to a real (= not previously tested) input is in some sense correct. How would we define such a device? In my mind, this is a deterministic machine whose outputs are possibly quite close to the “correct” solution, with an error probability that is (a) unknown a priori and (b) dependent on the type and amount of training. More importantly, though, even if the input/output correlation is correct in some norm, we cannot know why: physical reasoning is completely out of the picture here, and an ANN would tell you that, yes, pv for a certain gas is almost always equal to about ζRT, but it cannot tell you why. (You can see I am not a fan of ANNs… though I am usually the loser in the many and much-heated discussions with my more expert colleague Roberto Melli, who is instead their strong supporter…) Lately, a new tool has been added to DDAD, the so-called “Deep Thinking” machines (DT), basically Inference Engines (IEs) that use parallel or multi-mode logical inference (“IF p, THEN q”) to solve complex problems. Their IEs work by exploring multiple logical paths simultaneously or switching between different models to find the best solution for complex design and research tasks. Great, yes? Well, not quite: solving a problem related to the deciphering of a Hittite document and designing a scramjet require different knowledge, different backgrounds and -most importantly- a different thinking mode. As evidence, just put a space engineer and an archaeologist at the same table and let them discuss a novel topic external to their respective fields… As a consequence, the IE chosen to solve a certain problem must possess a topic-specific logic, so it has to be selected by another IE that must decide what the general field of application is.
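To make the “correlation without explanation” point concrete, here is a minimal toy sketch (purely illustrative, not one of the tools mentioned above; the gas constant, temperature range and network size are all my own assumptions): a tiny neural network fitted to synthetic ideal-gas data recovers pv ≈ RT quite accurately, yet nothing in its weights “knows” the equation of state.

```python
# Toy illustration: a one-hidden-layer network (plain NumPy) fitted to synthetic
# ideal-gas data. It learns the correlation pv ~ RT, but cannot say why.
# All numbers (R for air, temperature range, network size) are assumptions.
import numpy as np

rng = np.random.default_rng(0)
R = 287.0                                  # gas constant for air [J/(kg K)]
T = rng.uniform(250.0, 400.0, (200, 1))    # training temperatures [K]
pv = R * T                                 # "measured" pv data [J/kg]

x, t = T / 400.0, pv / pv.max()            # normalize input and target to order 1

H = 8                                      # hidden-layer width
W1, b1 = rng.normal(0, 0.5, (1, H)), np.zeros(H)
W2, b2 = rng.normal(0, 0.5, (H, 1)), np.zeros(1)

for step in range(10000):                  # plain gradient descent on the MSE loss
    h = np.tanh(x @ W1 + b1)
    y = h @ W2 + b2
    g_y = 2.0 * (y - t) / len(x)           # dLoss/dy
    g_W2, g_b2 = h.T @ g_y, g_y.sum(0)
    g_h = (g_y @ W2.T) * (1.0 - h**2)      # back-propagate through tanh
    g_W1, g_b1 = x.T @ g_h, g_h.sum(0)
    for p, g in ((W1, g_W1), (b1, g_b1), (W2, g_W2), (b2, g_b2)):
        p -= 0.2 * g

T_test = np.array([[300.0]])
pv_pred = (np.tanh(T_test / 400.0 @ W1 + b1) @ W2 + b2) * pv.max()
print(f"predicted pv at 300 K: {pv_pred[0, 0]:.0f} J/kg   (RT = {R * 300.0:.0f} J/kg)")
```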
The AI I have in mind reflects this distinction and is less “general” and more “reductionistic”. Each field has its specific Knowledge Base (KB) and the IE acting on it must be guided by some topic-specific rules in addition to the general logical paradigms.
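A toy sketch of what this could look like, with every name, keyword and rule invented purely for illustration: each field carries its own small knowledge base, a “meta” selector decides which field a problem belongs to, and only then does a simple forward-chaining inference engine run the topic-specific rules together with the general ones.

```python
# Illustrative only: field-specific knowledge bases, a meta-level selector,
# and a minimal forward-chaining ("IF premises THEN conclusion") engine.
GENERAL_RULES = [({"quantitative_model", "validated_data"}, "credible_result")]

FIELD_KB = {
    "thermodynamics": {
        "keywords": {"exergy", "entropy", "scramjet", "turbine"},
        "rules": [({"irreversibility"}, "exergy_destruction"),
                  ({"exergy_destruction", "cost_model"}, "thermoeconomic_penalty")],
    },
    "archaeology": {
        "keywords": {"hittite", "tablet", "excavation"},
        "rules": [({"cuneiform_sign_list", "tablet"}, "transliteration")],
    },
}

def select_field(problem_text):
    """Meta-level IE: pick the topic-specific KB from the problem statement."""
    words = set(problem_text.lower().split())
    best = max(FIELD_KB, key=lambda f: len(words & FIELD_KB[f]["keywords"]))
    return best if words & FIELD_KB[best]["keywords"] else None

def infer(facts, rules):
    """Forward chaining: apply the rules until no new fact can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

field = select_field("design a scramjet combustor with minimum exergy destruction")
facts = infer({"irreversibility", "cost_model"},
              GENERAL_RULES + FIELD_KB[field]["rules"])
print(field, facts)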
ECOS 2026: You are one of the pioneers in promoting digital tools in Applied Thermodynamics and in engineering in general. Many of us remember the COLOMBO and CAMEL platforms, which also inspired many other similar applications. Today, the unprecedented evolution of very powerful tools such as LLMs, SLMs, Multi-Agent Platforms, RAGs, MCPs and many others tends to change the engineering profession. How do you advise your students to address digital literacy in their future careers?
E.S.:
I insist that they first formulate the problem in terms of first principles and then write (or set up) the code in an object-oriented fashion, because this provides more programming flexibility and also helps them debug it at a later stage. Then I ask them to examine the solution with a very mistrustful attitude, again checking its physical credibility. This is especially important in Exergy and Thermo-Economic Analyses.
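As a sketch of the style being recommended here (this is not CAMEL or COLOMBO code, just a minimal invented illustration with hypothetical numbers and property values): each component is an object that carries its own first-principles balances and a built-in “mistrustful” check on the physical credibility of the result.

```python
# A minimal, illustrative sketch of an object-oriented exergy-analysis component
# with a physical-credibility check. All property values and flows are assumed.
from dataclasses import dataclass

T0 = 298.15  # dead-state temperature [K], assumed

@dataclass
class Stream:
    m: float      # mass flow rate [kg/s]
    h: float      # specific enthalpy [J/kg]
    s: float      # specific entropy [J/(kg K)]

    def exergy_flow(self, h0, s0):
        """Flow exergy rate relative to an assumed dead state (h0, s0)."""
        return self.m * ((self.h - h0) - T0 * (self.s - s0))

@dataclass
class Compressor:
    inlet: Stream
    outlet: Stream
    power_in: float  # shaft power supplied [W]

    def exergy_destruction(self, h0, s0):
        """Exergy balance: destruction = (exergy in + work) - exergy out."""
        e_in = self.inlet.exergy_flow(h0, s0) + self.power_in
        e_out = self.outlet.exergy_flow(h0, s0)
        return e_in - e_out

    def check_credibility(self, h0, s0):
        """Mistrust the numbers: destruction must be >= 0, efficiency in (0, 1]."""
        ed = self.exergy_destruction(h0, s0)
        eta_ex = 1.0 - ed / self.power_in
        assert ed >= 0.0, "negative exergy destruction: model or data error"
        assert 0.0 < eta_ex <= 1.0, "second-law efficiency outside (0, 1]"
        return ed, eta_ex

# Hypothetical air-compressor data, for illustration only
h0, s0 = 298.3e3, 6.86e3   # approximate dead-state enthalpy/entropy for air
comp = Compressor(Stream(1.0, 298.3e3, 6.86e3),
                  Stream(1.0, 520.0e3, 6.93e3), power_in=230.0e3)
print(comp.check_credibility(h0, s0))
```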
ECOS 2026: Could you reveal the topic of your Keynote Lecture at ECOS 2026?
E.S.:
I have not yet finalized it, but I am thinking of discussing some theoretical points on the connection between Exergy Analysis and Sustainability Assessment, and possibly presenting some applications to realistic cases.
ECOS 2026: At ECOS 2026, we shall start a new dimension of ECOS Conferences by introducing ECOS Short Certificate Courses. What is the topic of the short certificate course that you are preparing? What is the profile of the target participants for the course?
E.S.:
Well, Extended Exergy Accounting, of course… and the prerequisites are Thermodynamics and Engineering Economics (students with a knowledge of Thermo-Economics will find the course quite easy, but will be surprised by some new points EEA raises about Sustainability).
ECOS 2026: What is your message to the participants of the ECOS 2026 Conference that will be organized next year in Constanta, Romania?
E.S.:
To the Students and Colleagues attending for the first time: ECOS is an example of the success of a different way of approaching the idea of “An Engineering Conference”. With stubbornness, personal sacrifice and a strong belief in scientific values, the small group of 10-15 people who first met in 1984 in New Orleans within the Advanced Energy Systems group, under the leadership of Richard Gaggioli, has grown into an international organization uninterruptedly holding yearly conferences on five continents (Australian colleagues, we’re waiting…). The quality of the presented papers is of a high standard; innovative research has stemmed from the cooperation initiated at these conferences; Agencies and Industry have benefitted from our research; new Master- and Doctoral-level courses have been born out of material discussed in our Panels… just keep the ECOS torch burning! Just do what we did for the past 42 years: listen, read, think, propose, discuss, teach, publish, fight…
Prof. Enrico Sciubba is a well-known professor of engineering, thermodynamics and turbomachinery. From 1973 to 2019 he was a professor at the Sapienza University of Rome. At present he is a professor at the Niccolò Cusano University and Secretary General of the ECOS International Society. In recognition of his entire career, the Senate of Ovidius University of Constanta bestowed on him the title of Doctor Honoris Causa and appointed him chair of the International Scientific and Innovation Coordination Committee of the Institute of Nanotechnologies and Alternative Energy Sources.

