We spoke to Zach Daniel, who interned at Simudyne for three months, about his time with the company.
How did you find out about Simudyne?
I found out about Simudyne from their feature in the WSJ article “A New Way to Spot the Next Financial Crisis.” While I was studying applied mathematical modeling in school, I spent a lot of my extracurricular time learning about complex systems and the different ways we can model and learn about them. I had already been studying and tinkering with Agent-Based Modeling (ABM) when I read about Simudyne’s work, so I was very intrigued and decided to reach out.
What projects have you been working on?
For the past three months I’ve been working on the Market Simulation team. My main projects have included building a user interface for the market simulator, generating synthetic orderbook data, and researching other ABM implementation techniques.
I’d never done any front-end work prior to the GUI project, and I found it more engaging than I had expected. When creating a highly technical product, it’s important to distill complicated ideas so that the user can understand them easily and accurately. While ABM is considered a glass-box modeling approach, from the user’s perspective the transparency of that glass box is determined by the design and functionality built into the front-end.
What key learnings have you taken from the experience?
I found Simudyne’s perspective on modeling and simulation to be far more grounded in reality than the methods currently used in industry. Learning how to brainstorm use cases of different ABM techniques and implement them is very challenging but rewarding.
Often the real color lies in the frictions that are disregarded in both linear and mental modeling. Our minds aren’t powerful enough to understand the outcomes of even the simplest and smallest complex adaptive systems. Instead, we rely on anecdotal evidence and learn from past mistakes—but this only allows us to learn from one data point (the reality that unfolded). With simulation and counterfactual analysis, we not only understand ‘what could’ve happened’ but also ‘what would’ve been different’ if we had made a different set of decisions.
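The counterfactual idea above can be sketched in a few lines. This is a hypothetical toy example, not Simudyne code: by replaying the exact same random shocks under a fixed seed, we can compare the outcome of the decision we actually made against the outcome of the decision we could have made, holding the rest of the world constant.

```python
import random

def simulate(decision, seed=42, steps=100):
    """Toy market: cumulative P&L from random shocks.
    'decision' is a hypothetical hedge ratio in [0, 1]."""
    rng = random.Random(seed)  # fixed seed => identical shocks across runs
    pnl = 0.0
    for _ in range(steps):
        shock = rng.gauss(0, 1)
        pnl += (1 - decision) * shock  # a higher hedge ratio dampens exposure
    return pnl

actual = simulate(decision=0.2)          # the decision we made
counterfactual = simulate(decision=0.8)  # the decision we could have made
print(f"actual PnL: {actual:.2f}, counterfactual PnL: {counterfactual:.2f}")
```

Because both runs see the same shocks, the difference between the two P&Ls isolates the effect of the decision itself—that is the ‘what would’ve been different’ question the passage describes.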
The ‘butterfly effect’ is one of the problems with applying linear approximations in the real world. This sensitive dependence on a system’s initial state causes approximation error to blow up even over short timescales. While capturing the full realm of possible outcomes would require a model of impossibly high fidelity, knowing anything about whether a given outcome is possible lets us steer towards or away from it, and highlights what should be explored further.
What did you enjoy most about working at Simudyne?
I really enjoyed the team’s workflow. Everyone at Simudyne approaches challenges with extreme confidence in their abilities, but with humility about their own knowledge and ideas. Each project begins with research or a literature review—seeing how others have attacked a problem and then trying to improve on the most successful attempts. Learning quickly is extremely important because you’ll rarely (if ever) be tasked with something you’ve done before. At the same time, the Simudyne team always has advice, or knows where you can find it. They have an amazing ability to understand and engage with each other’s work.
What did you find most challenging?
The most challenging part of working at Simudyne is that nothing is done by the book, because the book is still being written. There is no industry standard to improve on, but rather a modeling language and perspective to build. In applying these ideas, validating both the models and the knowledge they produce demands a great deal of thought and questioning.