Discussion Central


Are you using Artificial Intelligence?

  • 1.  Are you using Artificial Intelligence?

    Posted 21 days ago
    I know that many of you are using artificial intelligence in your work as you solve tough technical problems. Where have you found AI to be most useful? And what platform are you using?

    Dan Lambert
    Savannah River National Lab

  • 2.  RE: Are you using Artificial Intelligence?

    Posted 21 days ago
    Dear Dan,
    The answer is yes! Artificial intelligence has become rock star in the scientific field these days. It is widely accepted in computer science and information technology. Until last year, I have noticed how important of AI to the field of chemical engineering. It can solve differential equations (Piscopo, Physical Review D, 2019). It can solve some non-interpretable problems such as wastewater treatment (Tan, Process safety and environmental protection, 2018). I also published a paper on AI namely "Solving anaerobic digestion with neural network: application to high concentration organic wastewater". Unfortunately, the paper is written in Chinese or I will send it to you. Whatever, I am working on implementing AI into chemical engineering. Lucky to meet you. Hope to share idea with you later.

    Ming Zhu
    Nanjing Tech University

  • 3.  RE: Are you using Artificial Intelligence?

    Posted 20 days ago
    What programming language do you use, Professor Zhu?  I have heard that Julia is helping grow the use of artificial intelligence.


    Kirsten Rosselot
    Process Profiles
    Calabasas, CA United States

  • 4.  RE: Are you using Artificial Intelligence?

    Posted 20 days ago
    There are a lot of languages for machine learning, for example, MATLAB, Julia, Python, Clojure, Ruby, C++, Go, etc. Personally, I choose Python.

  • 5.  RE: Are you using Artificial Intelligence?

    Posted 20 days ago
    Hello Dan,
    I spent most of my career working on AI software development and applications, mostly for online monitoring and control. In particular, fault detection, isolation, and mitigation, which is a very natural application area, or more generally "abnormal operations". That's not only because the technology fits, but because there are big incentives to better recognize and then handle "abnormal" operations caused by process problems, sensor problems, or operator errors. Faults can be of the catastrophic kind, but also the more mundane kind leading to off-spec product or excessive energy consumption. Most of the tools were also useful in other domains, such as network management.
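    As a rough sketch of the residual-style fault detection described above: compare measurements against a model prediction and flag large residuals. (The function name, noise level, and data below are my own illustration, not from any particular system.)

    ```python
    import numpy as np

    def detect_fault(measured, predicted, noise_sigma, k=3.0):
        """Flag samples whose model residual exceeds k times the expected sensor noise."""
        residual = np.abs(np.asarray(measured) - np.asarray(predicted))
        return residual > k * noise_sigma

    # illustrative data: the model predicts 0 throughout; the sensor drifts after sample 7
    model = np.zeros(10)
    sensor = np.array([0.1, -0.05, 0.02, 0.0, 0.1, -0.1, 0.05, 5.0, 5.2, 5.1])
    flags = detect_fault(sensor, model, noise_sigma=0.1)  # threshold = 0.3
    ```

    Real systems are of course more elaborate (model uncertainty, multiple sensors, isolation logic), but most of them bottom out in some comparison like this.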

    There's always been a problem defining what "AI" means. In practice, it tended to mean "using new-ish computer technology for things that have been hard to automate". It long included things like rule-based systems, neural networks and other pattern recognition and machine learning methods, clustering, reasoning based on causal diagrams or logic diagrams, constraint-based reasoning, goal-seeking, decision trees, etc. This included almost anything requiring an "engine" for operating on data combined with some form of knowledge representation either created by experts or derived from previous data (or often a combination of both). The reality is that practical systems always needed to be "hybrid" combinations of multiple tools. For instance, in real time systems, data is noisy and traditional filtering methods must be applied to avoid "chattering" in the conclusions, accepting some lags in analysis as a result.
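    The "chattering" point above can be sketched in a few lines: smooth a noisy raw detection signal and add hysteresis, accepting some lag in exchange for a steady conclusion. (All names and thresholds here are illustrative assumptions, not from any particular product.)

    ```python
    def filtered_alarm(samples, alpha=0.3, on_level=0.8, off_level=0.4):
        """Exponentially smooth a raw 0/1 fault signal, then apply hysteresis
        (separate on/off levels) so the alarm doesn't chatter on noisy data."""
        smoothed, alarm, states = 0.0, False, []
        for x in samples:
            smoothed = alpha * x + (1 - alpha) * smoothed  # exponential filter
            if not alarm and smoothed >= on_level:
                alarm = True
            elif alarm and smoothed <= off_level:
                alarm = False
            states.append(alarm)
        return states

    # the raw detection flips on and off; the filtered alarm turns on once and stays on
    raw = [1, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
    states = filtered_alarm(raw)
    ```

    Note the alarm only trips at sample 8, several samples after the raw signal first fired: that is exactly the lag-versus-stability trade-off mentioned above.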

    Clustering of data is a significant tool in today's AI, and it's long been a part of the AI toolkit -- for instance, as a starting point for the cluster centers in RBFN (Radial Basis Function Networks). Neural net technology has improved, including the deep-learning variant, but neural nets are still universal function approximators - essentially an improved method of nonlinear regression to approximate a process model or build a classifier.
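    A minimal sketch of the clustering-plus-RBFN idea: use k-means cluster centers as the RBF centers, then fit the output weights by linear least squares, giving the "improved nonlinear regression" described above. (The toy data, width, and function names are my own illustration.)

    ```python
    import numpy as np

    def kmeans(X, k, iters=20, seed=0):
        """Plain k-means; the cluster centers double as RBF centers."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = X[labels == j].mean(axis=0)
        return centers

    def rbfn_fit(X, y, centers, width=1.0):
        """Fit the output weights of an RBF network by linear least squares."""
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        Phi = np.exp(-((d / width) ** 2))  # Gaussian basis activations
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return w

    def rbfn_predict(X, centers, w, width=1.0):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        return np.exp(-((d / width) ** 2)) @ w

    # toy 1-D regression: approximate sin(x) on [0, 2*pi]
    X = np.linspace(0, 2 * np.pi, 50)[:, None]
    y = np.sin(X).ravel()
    centers = kmeans(X, k=8)
    w = rbfn_fit(X, y, centers)
    pred = rbfn_predict(X, centers, w)
    ```

    Because the hidden layer is fixed by clustering, only a linear solve remains, which is what makes RBFNs attractive compared to full backpropagation.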

    These days, "AI" tends to emphasize the "data science" aspects, and everybody claims it and hypes it, even for some of what might formerly have been considered just statistical analysis and model building, albeit with some newer tools such as logistic regression, and advances in clustering.
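    For what it's worth, logistic regression, one of the "newer tools" mentioned above, can be sketched in a few lines of plain gradient descent (the toy data and names are illustrative, not from any real application):

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.5, steps=2000):
        """Binary logistic regression by gradient descent, with a bias column."""
        Xb = np.hstack([X, np.ones((len(X), 1))])
        w = np.zeros(Xb.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-Xb @ w))      # sigmoid probabilities
            w -= lr * Xb.T @ (p - y) / len(y)       # gradient of log-loss
        return w

    def predict(X, w):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)

    # toy separable data: small-valued points are class 0, large-valued are class 1
    X = np.array([[0.0, 0.2], [0.3, 0.1], [-0.2, 0.1],
                  [2.0, 1.5], [1.8, 2.2], [2.5, 1.9]])
    y = np.array([0, 0, 0, 1, 1, 1])
    w = fit_logistic(X, y)
    ```

    It really is just statistics with a sigmoid link function, which is rather the point: much of today's "AI" is rebranded model building of this kind.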

    See the web page at

    There are relevant links there. In particular, I'd highlight the link to the IFAC paper on lessons learned about knowledge-based systems in process control in actual industrial applications. That paper may seem surprisingly old, but I'd say its conclusions are still valid, even though the specific systems described there have probably all been replaced or discarded by now. There are also links to some tutorials on Fault Diagnosis, causal digraphs (CDG), neural nets, logic flow diagrams (GDA), etc., that reference other papers about theory and applications.

    As an aside only partly related to the original question, there's also a link there to "Big Data Approximating Control", which is a new data-driven estimation and control approach that includes real-time clustering as well. This is promising but not yet reduced to practice for control: research is needed to define conditions under which control will be stable (for instance, by selection of the data, etc.). This could be a great topic for some PhD candidate. The method is actually simple, but the proofs for sufficiency conditions for stability may require significant mathematical skills.

    People like to argue about the best platforms. At least in the case of real-time systems, there is a difference between what's useful for flexibility in research and simulated systems, vs. what integrates well with the hardware and supports a useful end-user interface in delivered online applications. So, the best platform depends on your needs. In the old days, we used tools like Gensym's G2 which were flexible and integrated with process control systems. But, that was too expensive and complex for most people, and the general computer language part of it never became popular. It included a real-time rule engine, object-oriented knowledge representation, methods for dealing with aging of data, a basic (non-stiff) differential equation solver, and graphics suitable for building graphical languages. We had to add a lot of functionality in toolkits for things like neural nets, causal model diagrams, logic diagrams, and so on.

    Gregory Stanley
    Performity LLC
    Spring TX