
What are my chances nowadays of getting into a Quant Developer role with a PhD in Computational Mech. Eng.?

Hello all,

I would like to get advice from people inside the world of Quantitative Finance about my chances nowadays of getting in as a Quant Developer without much knowledge of Financial Mathematics.

I hold a PhD in Computational Mechanics from a mid-upper tier university here in the UK (Russell Group), so I have experience in numerical methods (finite differences, finite elements, more exotic numerical methods, numerical codes for solving PDEs), and in coding numerical algorithms in the object-oriented paradigm (mainly C++98) and in the procedural one (Matlab and Fortran).
My background is in Mechanical Engineering, and I am by now in my thirties.

Since I got my PhD and cut my remaining ties with the university this past June, I have set out to fill my knowledge gaps with an eye on a role in Quantitative Finance.
I feel my weak spots are mainly Probability & Statistics (building a base towards Machine Learning, eventually), Software Design and Patterns (I have never read the Gang of Four book) and, obviously, Financial Engineering itself.
I set out to improve my knowledge of these topics with books, MOOCs and online university lectures, but have since realised that books like Joshi's 'The Concepts and Practice of Mathematical Finance', his C++ book, Hull's book on options and derivatives, statistics textbooks and the like each take a good two or three months of study to get a solid understanding out of them and a reasonable number of exercises done.
Even having completed that, I would need further time to go through the interview preparation books like 'Cracking the Coding Interview' and then Stefanica's or Joshi's more quant-focussed interview books.

By now, this period of many months of self-study has led to much self-doubt, because I am quite on my own in doing this and not in any way involved with the finance world. In this regard, signing up for an MFE degree would have helped.
Keeping my current self-study pace, I believe I may be ready for interviews perhaps in June or for the summer.
Having been studying since the summer, by then I will have spent a year or so preparing, which is basically the same period spent enrolled in an MFE (minus the tuition fee, which I do not think I can afford: right now I am kept afloat by my savings and a zero-hours contract at a menial job).

All in all, you could say my profile fits that of an aspiring Quant of 10-15 years ago.
What do you think my chances are, come interview time? Should I angle for classic pricing or risk roles at banks, or for algorithmic trading, given my background in low-level programming languages?
And also, does the classic Quant Developer figure ('glorified programmer', they say) still exist, or has it been fully or partly superseded by machine-learning-driven automated trading?

Thank you for taking the time to read this,
and Cheerio!
 
For C++, Quantnet/Baruch are very popular.

Gang of Four patterns are severely outdated and are a snapshot of C++ in 1992. Things have moved on.
C++ 98 is also outdated in a sense.

The Baruch courses are up to date.

I like the numerical analysis background; it's not every day you meet someone who knows FEM.
 
Thank you for the answer and the hints.

Your name rang familiar in connection to googling resources on C++. Now I find in this interview your field of research in academia is not that far from mine, as I was/am involved with solving numerically hyperbolic systems of PDEs using meshless methods and trying to apply CFD-type artificial viscosity to Solid Mechanics problems, battling numerical instabilities. In the past I got to know people from Pavia, too... I'd guess Franco Brezzi was particularly active back when you were there.

People in my environment hold the Gang of Four book as a kind of totem, of time gone then we may add...

Regarding C++98/03, I am glad to have learned it instead of the modern revisions.
I got to learn and got used to keeping track of and cleaning up my pointers myself after use, deleting them one by one in case they are inside a vector, etc., so I did not feel the need for the unique_ptr or shared_ptr classes with all their verbose syntax...
The use of these new classes means one layer of control over the machine I have to give up...
Also, using something as, for instance, 'std::vector<std::string>::size_type' as iterator type inside loops, instead of the now ubiquitous 'auto', does not really inconvenience me too much.

Having said that, what use can there be for FEM, or Finite Volumes, or meshless techniques in the current data science-ridden financial engineering landscape?
 
Regarding C++98/03, I am glad to have learned it instead of the modern revisions.
I got to learn and got used to keeping track of and cleaning up my pointers myself after use, deleting them one by one in case they are inside a vector, etc., so I did not feel the need for the unique_ptr or shared_ptr classes with all their verbose syntax...
You are programming C, not C++. Even with C++98/03 I can't imagine anyone (except embedded folk or others working with very limiting hardware) managing their memory manually (as in, without RAII) sans a few exceptional cases at some hopefully well labeled, well contained, unsafe part of the code. There is no reason why one wouldn't use boost or even code up a version of shared_ptr oneself. The benefits are myriad.

From what you've described about your background, your greatest asset should be your coding skills, but honestly, and I'm not trying to be rude here, you are decades behind, and to be frank I expect you'll find it very difficult to leverage this into a quant position. This, of course, can be fixed with a bit more study, and you rightly ask whether you're wasting your time. It's not going to be easy, but no, I don't think it is a waste of time: you should have the academic credentials to get past CV screens to interviews, and then it's just a matter of passing them.

So let's start with the age: being in your early 30s is absolutely not an issue in itself; quite a few people move to the quant space after a postdoc and they'll be around your age. That said, you can probably expect your line manager to be younger than yourself (depending on the firm and how deep the hierarchy runs).

For probability, you'll need the basics to pass interviews, but brushing this up should not be a matter of months. As for machine learning, you'll now need to choose whether to market yourself as a traditional derivatives pricer or as a machine learning expert. If you've used no machine learning in the course of your studies, the latter can be a bit difficult---derivatives pricing, on the other hand, you can claim to understand to an extent, having done some PDEs. If machine learning is what you want to do, though, there are plenty of non-quant roles, and even more outside finance altogether, so from that angle going all in on machine learning also covers a plan B.

I don't think you'll be expected to know much in terms of financial engineering (definitely nothing beyond Hull or Joshi---and not all of either; probably more like the level of the interview books, which are basically just Black-Scholes), and the positions you should be applying to are internships, unless you already have a quant internship on your CV (which I assume you don't). Ideally you would have applied late last year during the internship hiring season, but there might be off-cycle hiring during the summer. Full-time junior (i.e. associate) positions, as you will find, typically ask for the skillset of someone who has spent a year or three at a bank, and it is not realistic for you to claim knowledge to this extent (that said, it doesn't hurt to apply; if you do get an interview, they clearly are willing to hire juniors without experience too, so again I don't think they'd dive too deep into the financial engineering aspect).
 
That was precious feedback, @KillingField, thank you.

So I take it I should focus on strengthening what I already know, and just acquire the general picture of what coding pricing models entails by jumping straight to the interview books (Cracking the Coding Interview, Stefanica's brainteasers book and the like).
I never thought about an internship. Indeed, job listings for the role are less strict about their financial engineering requirements than ads looking for junior quants. But frankly, yes, I would be a bit worried about the age difference with other interns.
There is not much going on at the moment on the main job boards anyway; most calls involve machine learning and have 200+ applicants already. Perhaps come spring more opportunities will pop up.
I could also resort to a recruiter once I feel ready; getting a foot in the door that way would, I figure, be worth the cut they are going to take.


With regard to the smart pointer issue, I understand mine could come across as a bit of a contrarian statement: after all, that feature must have been introduced because real problems existed in keeping track of pointers the old way, especially in large projects with multiple coders, I believe.

From what I have seen, the more likely instance in which memory leaks appear is when dealing with a vector of pointers that gets cleared (my_vector.clear();) without first deleting the individual pointers in it. The implementation will not delete the pointed-to objects while clearing the vector; this is how it happens in C, and in C++ as well. So what one has to do is set up a loop to delete the elements of a vector of pointers before clearing the vector and exiting scope. This is what I meant in my previous comment.
This "vector of pointers" issue is a bit more problematic than the similar case with arrays, because it is more obscure: deletion of arrays after use is better known and well publicised, and something every C++ book warns you about.

The RAII principle should be applied when designing classes; pointers are primitive elements. Hence for classes, one could place a line such as "if (ptr) { delete ptr; }" inside the class destructor to take care of RAII.

Thus, I can understand the introduction of new pointer classes that relieve the programmer from the effort of releasing memory in this specific case. The rest appears to me as a means to counter coding laziness, unless, again, we are talking about large projects in which each coder has to deal with code written by someone else, and confusion may arise about releasing resources behind pointers someone else introduced.
 
Thank you for the answer and the hints.

Your name rang familiar in connection to googling resources on C++. Now I find in this interview your field of research in academia is not that far from mine, as I was/am involved with solving numerically hyperbolic systems of PDEs using meshless methods and trying to apply CFD-type artificial viscosity to Solid Mechanics problems, battling numerical instabilities. In the past I got to know people from Pavia, too... I'd guess Franco Brezzi was particularly active back when you were there.

People in my environment hold the Gang of Four book as a kind of totem, of time gone then we may add...

Regarding C++98/03, I am glad to have learned it instead of the modern revisions.
I got to learn and got used to keeping track of and cleaning up my pointers myself after use, deleting them one by one in case they are inside a vector, etc., so I did not feel the need for the unique_ptr or shared_ptr classes with all their verbose syntax...
The use of these new classes means one layer of control over the machine I have to give up...
Also, using something as, for instance, 'std::vector<std::string>::size_type' as iterator type inside loops, instead of the now ubiquitous 'auto', does not really inconvenience me too much.

Having said that, what use can there be for FEM, or Finite Volumes, or meshless techniques in the current data science-ridden financial engineering landscape?
Yes, Franco Brezzi was there. Nice, friendly man. The head of Analisi Numerica was the legendary Enrico Magenes. At the time I was researching FEM for 1st order hyperbolic systems of Friedrichs type.
Regarding C++, your approach _seems_ rather archaic, with a misunderstanding of what smart pointers really stand for. C++ design has moved on since C++98. BTW what about Python?
FEM and FVM are not used so much IMO (takes too long to learn). FDM is much more popular.

"People in my environment hold the Gang of Four book as a kind of totem, of time gone then we may add..."
What's the translation of this in straightforward English?

For the record, I am the originator of the Quantnet and Baruch C++ courses.
 
Yes, Franco Brezzi was there. The head was the legendary Enrico Magenes.
Regarding C++, your approach _seems_ rather archaic, with a misunderstanding of what smart pointers really stand for. C++ design has moved on since C++98. BTW what about Python?
FEM and FVM are not used so much IMO (takes too long to learn). FDM is much more popular.

"People in my environment hold the Gang of Four book as a kind of totem, of time gone then we may add..."
What's the translation of this in straightforward English?

"People in my environment hold the Gang of Four book as a kind of totem,[. A totem] of times gone [by,] then[,] we may add..."
Better?
 
"People in my environment hold the Gang of Four book as a kind of totem,[. A totem] of times gone [by,] then[,] we may add..."
Better?
Not the punctuation, but what you are trying to say is not clear to me.
So, GOF == old hat? It's analogies I have difficulties with ... so many possible interpretations.
(I have applied GOF to many technical apps (CAD, Risk, Process) since 1992. Now I have developed multiparadigm OOP-GP-FP patterns.)
 
I went along with the opinion on that book you expressed in your first post here.
One less book to read for me. Good: I can go straight on to SICP when I have the time to spare.
I should qualify my remarks on GOF: the patterns are very good, but _not_ implemented in OOP C++98 style as is done in the GOF book. And many patterns are directly supported in C++11 and Boost.

Don't throw the baby out with the bathwater.
 
Yes, Franco Brezzi was there. Nice, friendly man. The head of Analisi Numerica was the legendary Enrico Magenes. At the time I was researching FEM for 1st order hyperbolic systems of Friedrichs type.
Regarding C++, your approach _seems_ rather archaic, with a misunderstanding of what smart pointers really stand for. C++ design has moved on since C++98. BTW what about Python?
FEM and FVM are not used so much IMO (takes too long to learn). FDM is much more popular.

"People in my environment hold the Gang of Four book as a kind of totem, of time gone then we may add..."
What's the translation of this in straightforward English?

For the record, I am the originator of the Quantnet and Baruch C++ courses.

Brezzi is known to me because of the LBB stability condition in mixed FEM.
Pavia is still prominent; I have met people from there at Computational Mechanics conferences. The group of A. Quarteroni in Maths and that of F. Auricchio in Structural Engineering come to mind.
I too worked on the application of meshless methods to conservation laws, but I am an engineer and not a mathematician, so the focus was on patching up numerical instabilities through (suitably tuned) artificial dissipation.

Again on smart pointers, to simplify my stance: as I understand it, all the fuss behind their introduction was because people overlooked deleting pointers once done with them. I have already recognised this can be tricky at times, for instance when there are containers such as vectors with pointers as components: you can call container.clear(), but the pointed-to objects will be left occupying memory if they are not deleted in a loop first.
Even the habit of wrapping a class around pointers, with counters so they point to unique objects, was conceived because people would not delete them after use, for unfathomable reasons.
At my level of coding I have always deleted my 'new' objects when needed, so to my knowledge I have never had problems with memory leaks.
So it follows I have never needed smart pointers: if it's not broken, why fix it.
In a way it is flattering to think about: so many people forgot to use 'delete' that smart pointers had to be introduced.
Probably things change in larger collaborative projects, but I wouldn't know because I have always written and run my own programs.

I don't know much about Python. I know it is used in Machine Learning.
In my code I used Lapack to call numerical algorithms when needed. With Lapack, picking the right subroutine for the job is an art in itself. Is using Lapack considered old school? It is the best library for numerical methods as far as I know.
Then I used Matlab for prototyping, quick visualisation etc, so no need to learn Python.
 
Most of the Python numerical libraries are user-friendly wrappers for Fortran and C/C++ libraries, e.g. LAPACK, QUADPACK, etc.
On a general note

1. Subtype inheritance has fallen somewhat out of fashion, which means fewer pointers to base classes.
2. C++11 has a new memory model.



I have already recognised this can be tricky at times, for instance when there are containers such as vectors with pointers as components: you can call container.clear(), but the pointed-to objects will be left occupying memory if they are not deleted in a loop first.

This is scary. This is the kind of C++ code in early GOF Observer pattern. It's awful because the pattern a la GOF is a design error.
BTW what applications need vectors of pointers?? I suspect it is a C++98 fix for Boost Fusion, for example.


RTTI is not fashionable these days.

In a way it is flattering to think about: so many people forgot to use 'delete' that smart pointers had to be introduced.
I think it is a bit deeper than that. There are tools for this. The deeper problem is who deletes memory and when. Even when you want to delete, it is not exception-safe.
 
In a way it is flattering to think about: so many people forgot to use 'delete' that smart pointers had to be introduced.
I did not expect to physically cringe reading internet forums about quant employment opportunities, but there we go. What a time to be alive.

Probably things change in larger collaborative projects, but I wouldn't know because I have always written and run my own programs.
Aaaand there we go again, Dunning-Kruger at its finest.
 
I did not expect to physically cringe reading internet forums about quant employment opportunities, but there we go. What a time to be alive.


Aaaand there we go again, Dunning-Kruger at its finest.
Indeed. Writing code in a PhD environment is a million miles away from the same endeavours in Industry.
An overhaul is needed IMO. Things have moved on. But that's only what I think.


“A little learning is a dangerous thing. Drink deep, or taste not the Pierian Spring; There shallow draughts intoxicate the brain, and drinking largely sobers us again.” Alexander Pope
 
I have already recognised this can be tricky at times, for instance when there are containers such as vectors with pointers as components: you can call container.clear(), but the pointed-to objects will be left occupying memory if they are not deleted in a loop first.

This is scary. This is the kind of C++ code in early GOF Observer pattern. It's awful because the pattern a la GOF is a design error.
BTW what applications need vectors of pointers?? I suspect it is a C++98 fix for Boost Fusion, for example.

That happens when you have nodes in a mesh that belong to a hierarchy under a base node class: there are boundary nodes, stress-concentration nodes, etc., each its own child class of the base node class.

In a way it is flattering to think about: so many people forgot to use 'delete' that smart pointers had to be introduced.
I think it is a bit deeper than that. There are tools for this. The deeper problem is who deletes memory and when. Even when you want to delete, it is not exception-safe.

Deeper than most practical applications; I just want the job done. In a professional setting it is maybe different, as you state in a post below, but by how much? Do we need to optimise high-frequency signals to the nanosecond in each and every aspect of our quant code?
 
Indeed. Writing code in a PhD environment is a million miles away from the same endeavours in Industry.
An overhaul is needed IMO. Things have moved on. But that's only what I think.


“A little learning is a dangerous thing. Drink deep, or taste not the Pierian Spring; There shallow draughts intoxicate the brain, and drinking largely sobers us again.” Alexander Pope

How to achieve this overhaul (for under £100)? ---> with S. Meyers' Effective Modern C++?
And then by dabbling a little with QuantLib, once more familiar with The Concepts and Practice of Mathematical Finance?
 
I did not expect to physically cringe reading internet forums about quant employment opportunities, but there we go; What a time to be alive.


Aaaand there we go again, Dunning-Kruger at its finest.

You don't really strive for agreeableness, do you?
 