The Anonymous Widower

‘Digital Twin’ To Support More Robust Timetable Planning

The title of this post is the same as that of this article on Railway Gazette.

When I saw the title of the article, I was surprised that it was considered a newsworthy story.

I have been scheduling people, machines and other resources since the late 1960s, and creating printouts and graphs to help people manage businesses since the early 1970s.

In so many cases, I’ve found that digital models give great insight into the interactions between the factors affecting a system.

So I would have expected all train companies to have had a digital twin from at least 1980, especially as I know that BT and other phone companies had digital models of their networks by then.

If they don’t have a digital model of their network, how do train companies plan their timetables?

By trial and error!

Or do they start with marketing ideas like four trains per hour and then fit the timetable together like a jigsaw?
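The jigsaw-fitting a digital model automates can be sketched very simply. This is a toy illustration, not any real timetabling tool: the headway value and departure times are invented, and a real model would also handle junctions, dwell times and stopping patterns.

```python
# Toy headway check: can a set of proposed services share one track section?
# The minimum headway and all timings are invented for illustration.
MIN_HEADWAY_MIN = 3  # assumed minimum separation between trains, in minutes


def conflicts(departures_min, min_headway=MIN_HEADWAY_MIN):
    """Return pairs of consecutive departures that violate the minimum headway."""
    times = sorted(departures_min)
    return [(a, b) for a, b in zip(times, times[1:]) if b - a < min_headway]


# Four trains per hour, evenly spaced: no conflicts.
print(conflicts([0, 15, 30, 45]))      # → []
# A fifth, badly slotted train breaks the headway.
print(conflicts([0, 15, 30, 45, 46]))  # → [(45, 46)]
```

Even a check this crude beats trial and error, because the computer can test every pair of trains on every section in milliseconds.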

September 10, 2019 Posted by | Computing, Transport | , | Leave a comment

Building Scientific Models with Computers

This was the title of a lecture at University College London, that I attended yesterday lunchtime.

It was an excellent lecture, and in some ways it was like going back forty years to when I worked at ICI Plastics in Welwyn Garden City. In fact, two of the topics that Professor Catlow discussed were similar to problems I tackled all those years ago.

The first was the problem of turbulent and other flows. We had been interested in what happened inside an extruder, as it forced plastics such as polyethylene, polypropylene and PVC into moulds to produce the products needed. It was an intractable problem then, and I suspect it is almost as bad today, even though computers are now bigger and can handle many more nodes than the hundred or so we could manage on our PACE 231R or with IBM’s CSMP on a 360.
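To give a feel for what a hundred-node model means, here is a one-dimensional diffusion problem discretised into a hundred nodes by finite differences. This is a stand-in for the nodal approach described above, not ICI’s actual extruder model; the geometry and coefficients are made up, and a real melt-flow calculation would be far messier.

```python
# 1-D diffusion (heat equation) on 100 nodes by explicit finite differences.
# A stand-in for the kind of nodal model described above; coefficients invented.
N = 100      # number of nodes, roughly what the analog machine could manage
alpha = 0.2  # diffusion number alpha = D*dt/dx^2, kept below 0.5 for stability

u = [0.0] * N
u[0] = 1.0   # hot boundary at one end, cold at the other

for step in range(2000):
    new = u[:]
    for i in range(1, N - 1):
        # Each interior node relaxes towards the mean of its neighbours.
        new[i] = u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
    new[0], new[-1] = 1.0, 0.0   # fixed boundary temperatures
    u = new

# The profile decays monotonically from the hot end to the cold end.
print(round(u[1], 3), round(u[50], 3), round(u[-2], 3))
```

On the 231R, each node cost physical integrators and patch leads, which is why a hundred was about the limit; a modern laptop runs this loop in a blink.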

I also found his discussion of the various forms of molecules, and how they can be predicted, fascinating. If we’d had someone with his knowledge, we’d have got a lot further with another problem.

When you create polymers, you build long chains from monomers such as ethylene and propylene, which lock together like a series of odd-shaped Lego bricks. These chains then bind together to form the items we need.

At the time, ICI were trying to create an engineering plastic, which would be stronger and have a greater temperature range. I won’t name it here, as I don’t want to break any confidentiality, but suffice to say that the monomer, or polymer building block, needed to be created as a straight molecule for the integrity of the plastic. It was known that several forms of the monomer could be created, and that there was a rather complicated separation process to extract the straight ones. Just as in Professor Catlow’s example yesterday, water in the reaction was one of the factors that affected the proportion of the desired monomer.

Now, I’m not a chemist, but I was asked to look at the physics and dynamics of the reaction, with respect to removing the errant water from the reaction vessel as soon as possible after its creation, to reduce the damage it could do. In the end, I made myself very unpopular, as I often did, by finding a method that removed the water. I can remember searching Chemical Abstracts and finally finding the data I wanted in a paper published in 1909 by a Chinese researcher working in Canada. We don’t know how lucky we are with Google and the Internet.

I left ICI soon after I completed this work, so I don’t know the final outcome!

But to me, the exercise proved the value of using dynamic computer models based on differential equations, to understand difficult systems.
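A toy version of that kind of dynamic model looks like this. To be clear, the production and stripping rates are entirely invented, not ICI’s figures; the point is only that a one-line differential equation, integrated numerically, tells you how the standing water level responds to the removal rate.

```python
# Toy dynamic model: water produced by a condensation reaction and
# stripped from the vessel. All rates are invented for illustration.
def simulate(k_prod, k_strip, t_end=10.0, dt=0.001):
    """Euler integration of dw/dt = k_prod - k_strip * w; returns final w."""
    w = 0.0
    for _ in range(int(t_end / dt)):
        w += dt * (k_prod - k_strip * w)
    return w


# Doubling the stripping rate halves the standing water level,
# as the steady state w = k_prod / k_strip predicts.
print(round(simulate(1.0, 2.0), 3))   # → 0.5
print(round(simulate(1.0, 4.0), 3))   # → 0.25
```

On the analog machine this was one integrator and a potentiometer; the differential equation is the same either way.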

In some ways, I was able to do this work because I was properly taught calculus, and how to form differential equations, at school. Would such an important subject now be taught to sixteen-year-olds, as was regularly done in the 1960s at schools similar to the one I attended?

January 21, 2011 Posted by | Computing, World | , , , | 2 Comments

The Virtual Beagle

The headline of “It might look like a dog’s dinner; but this artificial stomach will save (canine) lives” caught my eye as I read The Times this morning.

Apparently, AstraZeneca have largely replaced dogs with an artificial stomach for drug testing. So not only is it good for drug development, it’s good news for dogs. I’ve always felt that animal testing was wrong from a scientific point of view as well: keeping animals is expensive, and the in vitro and computer alternatives are cheaper and much easier to scale up.

The Times article doesn’t say who is behind this development, but it does quote Troy Seidle of the Humane Society International as saying:

This new use of the intestinal model in drug testing is a fantastic example of how innovative technologies can replace animal experiments and improve medical research at the same time.

I have searched the Internet and it would appear that the company behind this wonderful development could be SimCyp, based in Sheffield.

But why is everybody being so coy about this development? This British company should be on page one of all the newspapers.

On a personal note, I was involved in computer simulation of processes for several years in the 1970s, when I worked at ICI. We always felt that computers had a large part to play in modelling the body, but little seems to have been heard of this over the last four decades. These are two pictures of the PACE 231R analog computer I used for simulation of chemical processes.


In my view, there are computers, good computers and the PACE 231R.

The 231R was built in the 1960s and it was all valve, or vacuum tube if you are from the United States. It was a formidable beast for solving differential equations, and I have a feeling that there isn’t one left, even in a museum. These pictures, taken by a colleague at ICI, seem to be two of the only ones of a 231R in a working environment. Hopefully, the Internet will preserve them for ever!
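On an analog machine you solved a differential equation by patching integrators and summers together. The digital equivalent, for a simple undamped oscillator x'' + c·x' + k·x = 0, is a few lines of numerical integration; the parameter values here are chosen purely for illustration.

```python
import math

# Digital stand-in for two patched analog integrators solving
# x'' + c*x' + k*x = 0. Parameter values invented for illustration.
c, k = 0.0, 1.0    # no damping, unit stiffness: exact solution x(t) = cos(t)
x, v = 1.0, 0.0    # initial position and velocity
dt = 1e-4

for _ in range(int(math.pi / dt)):   # integrate up to t = pi
    a = -c * v - k * x               # the "summer": acceleration
    v += a * dt                      # first integrator: velocity
    x += v * dt                      # second integrator: position

print(round(x, 2))   # x(pi) = cos(pi) = -1, so this prints -1.0
```

Each `+=` line is one integrator on the patch panel; the analog machine did all of them continuously and in parallel, which is why it was so quick at this kind of problem.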

The biggest claim to fame of the 231R was that two of them were used in tandem to solve all of the mathematics and differential equations of getting the Apollo spacecraft to the Moon. They were actually linked to what was virtually a real spacecraft, to test everything out.

So when Apollo 13 blew up and they had to use the Lunar Excursion Module to bring the astronauts home, it was these two computers that were reprogrammed to try to find out how to do it. They wouldn’t have stood a chance with a digital machine, but the engineers, programmers and astronauts were able to get the two 231Rs to find a strategy. I’ve never seen the Apollo 13 film, but I suspect that the role of the 231Rs is downplayed or ignored.

So when you ask me, what is the greatest computer ever made, there is only one answer.  The amazing PACE 231R.

January 8, 2011 Posted by | Computing, News | , , | 3 Comments