Posted over 11 years ago
Join William Fellows, Vice President of Research at 451 Research, and three industry experts to discuss the key trends in cloud adoption in financial services, and the essential components of a successful cloud deployment of the OpenGamma Platform.
Thursday, October 17
3pm BST / 10am EDT
Register Now
William Fellows, VP Research at 451 Research, will present his latest research highlighting spend on cloud across different verticals, the transition from on-demand infrastructure to sophisticated cloud management, and the key role played by cloud-enabling technologies. Finally, he'll look at how cloud service delivery is rapidly evolving and how the provider landscape is changing.
You’ll also hear from three industry experts on how they have successfully migrated a business critical application to the cloud:
Rethinking Risk Analytics: Kirk Wylie, Chairman and Co-Founder of OpenGamma, gives a brief overview of the company’s open source market risk analytics platform, and outlines the benefits of deploying OpenGamma’s light, fast risk technology using modern, cloud-centric techniques to achieve elastic burst capacity and measurable cost savings.
Bringing OpenGamma to the Cloud: Duncan Johnston-Watt, Founder and CEO of Cloudsoft, demonstrates how Cloudsoft's open source Application Management Platform (AMP) has cloud-enabled the OpenGamma Platform, allowing it to be delivered as a service to a large global sell-side institution.
Finally, you will see OpenGamma-as-a-Service in action running on Interoute, Europe's largest cloud services platform, exploiting its multi-zone Virtual Data Centre offering, underpinned by MPLS VPN delivered over its advanced fibre-optic network.
Register Now
Posted over 11 years ago
The ISDA CDS Standard Model is fairly ubiquitous in the pricing and risk management of single-name CDSs. Markit Group Limited are now the custodians of the C code behind the model, and the Bloomberg CDSW page calls this code - therefore traders and risk managers have come to expect to see the results the model produces.
As far as I know, there is no single document explaining exactly how the model works. There are a number of user guides and technical documents (produced by Markit) explaining bug fixes and improvements to the model, but nothing that goes in detail through all the maths.
The code is, however, open source, so it is possible, as I did, to read through the code and back out exactly what it is doing. My aim was to produce a Java version of the code. One approach is to produce a slavish translation. However, most likely due to the long evolution of the code through a series of developers, the code is fragmented and littered with inefficiencies. A better approach, which I followed, is to understand exactly what the code is trying to do and why (i.e. understand the model), then write a clean-room implementation.
The result of this was code that produced the same numbers (to machine precision) as the C code, but was much faster (to price CDS and calibrate curves) and (in my not so humble opinion) easier to understand.
For historical reasons (mainly due to CDS trades coming from the bond world), the risk management of CDSs has followed a bump-and-reprice methodology rather than the analytic sensitivities used in the (mathematically similar) interest rates world. There is no reason to prefer these (one-sided finite difference) approximations, other than that they are what people are accustomed to seeing. For the ISDA model it is easy to produce analytical first-order risk factors.
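To make the distinction concrete, here is a minimal sketch of the bump-and-reprice approach; the pricer function is a hypothetical stand-in for a full calibrate-and-price of a CDS from its quoted spreads, not OpenGamma's actual API.

```java
import java.util.Arrays;
import java.util.function.ToDoubleFunction;

/** One-sided finite-difference ("bump and reprice") spread sensitivities. */
public final class BumpAndReprice {

    /**
     * Approximates dPV/ds_i by bumping one quoted spread at a time and
     * repricing. The pricer maps a full set of spread quotes to a PV,
     * standing in for curve calibration plus CDS pricing.
     */
    static double[] spreadSensitivities(ToDoubleFunction<double[]> pricer,
                                        double[] spreads, double bump) {
        double basePv = pricer.applyAsDouble(spreads);
        double[] sens = new double[spreads.length];
        for (int i = 0; i < spreads.length; i++) {
            double[] bumped = Arrays.copyOf(spreads, spreads.length);
            bumped[i] += bump;                               // bump one quote
            sens[i] = (pricer.applyAsDouble(bumped) - basePv) / bump;
        }
        return sens;
    }

    public static void main(String[] args) {
        // Toy pricer: PV as a linear function of the quotes (illustration only).
        ToDoubleFunction<double[]> toyPricer =
            s -> -1_000_000 * (0.5 * s[0] + 1.5 * s[1] + 3.0 * s[2]);
        double[] parSpreads = {0.010, 0.012, 0.015};         // decimal spreads
        double oneBasisPoint = 1e-4;
        System.out.println(Arrays.toString(
            spreadSensitivities(toyPricer, parSpreads, oneBasisPoint)));
    }
}
```

The analytic alternative differentiates the pricing formulae directly with respect to each quote, so a full risk run does not need one calibration per bumped quote; the paper gives the explicit formulae for the ISDA model.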
In doing all this, I produced what turned out to be a lengthy document, which can be found on our website:
Download ‘The Pricing and Risk Management of Credit Default Swaps’ (PDF)
The paper describes general CDS pricing before detailing the ISDA model. It then discusses analytic risk factors and gives explicit formulae for the ISDA model. It goes on to discuss hedging and portfolio rebalancing - both as they are currently performed, and how they can be performed with analytic sensitivities. Finally I briefly discuss other models for CDS pricing.
I hope this document will prove useful to other practitioners, and I will gratefully receive any comments and suggestions for improvements, particularly on how the single-name CDS market currently operates.
Posted over 11 years ago
I hope everybody's all done with their summer holidays, and that your livers have dried out. Ours definitely have, so we think it's time for another evening of drinks, talks, and demos. That's right, it's OpenGamma OpenPub time!
Join us at The Counting House near Bank station on Wednesday 23rd October from 6pm onwards.
The very first OpenGamma OpenPub was held at The Counting House. We liked it so much we're back, but our pub nights have grown to the point where, rather than just hovering upstairs, we'll have a private room this time. Don't worry, we'll make sure you (and the bartenders) can find us.
We'll be demoing the upcoming 2.1 release of the OpenGamma Platform, and running informal roundtables around specific topics (multi-curves, converting OpenGamma to a cloud service, Excel integration, and more).
Come along to meet the team and other community members, see the Platform in action, and ask any questions you may have.
Simply register on Eventbrite so that we know who's coming. (When registering, please let us know if there are any particular roundtable topics you'd like to discuss with the rest of the community.)
Hope to see you there! Come thirsty!
Posted over 11 years ago
Last week OpenGamma attended and presented at the MATLAB Computational Finance Virtual Conference. The event showcased real-world examples demonstrating how researchers and developers can excel at their jobs, improve their research and business processes, reduce costs, and mitigate risks by using MATLAB.
My presentation focussed on embedding MATLAB components into the OpenGamma Platform. We looked at how to incorporate a MATLAB function into a MATLAB Production Server™ application, declare the Java interface, create the metadata to express input requirements, register the function in the function repository, and visualize the results in OpenGamma's risk viewer. (If you missed the presentation, you can download the slides here.) Many thanks to all the attendees, especially those who attended my presentation and/or visited our booth.
I managed to attend a few of the other sessions. Here is a quick summary of some of the most exciting topics:
Embracing Complexity: Jim Tung is continuing to drive the effort to help us code more effectively and safely. I really appreciate the improvements in unit testing that went into MATLAB, and the road ahead seems quite exciting.
Speeding Up Algorithms: When Parallel Computing and GPUs Do and Don't Accelerate: An excellent presentation by Aly Kassam and Michael Weidman, covering the parallelization capabilities of MATLAB in a very hands-on manner.
Evaluating Systematic Trading Strategies: Using MATLAB to Accelerate Quantitative Research: Ben Steiner gave a very clear conceptual view of the research process that goes into creating a trading strategy and drew parallels with other industries. He also gave some interesting workflow tips.
It was also nice to see the Spanish-speaking MATLAB community represented: Paula Poza gave a very good presentation about modelling credit risk. I will try to contribute to that effort by recording our presentation in Spanish and making it available as an on-demand webinar.
We were planning to give away one copy of the book Interest Rate Modelling after the Financial Crisis (OpenGamma's Marc Henrard contributed a chapter on derivative discounting). Since more than 250 of you showed up, we figured it would be better to give two of them away. The lucky winners are Boyi Liu and Philip Bennett - congratulations!
Posted over 11 years ago
What does it take to convert the OpenGamma Platform to a genuine cloud service, accessible from anywhere in the world, scaling out and back in any given region depending on demand?
We've always had at the back of our minds the idea of providing OpenGamma as a Service to smaller financial services organisations or those with limited in-house IT resources. In the meantime, the last couple of years have seen capital markets become much more receptive to the idea of cloud services, and many hedge funds have long used SaaS (now rebranded Cloud) risk services. However, due to a lack of internal resources and increasing demand from the enterprise market, we haven't made much progress on that front yet.
A recent Proof-of-Concept project by our friends at Cloudsoft shows that we may be closer to that goal than we originally thought.
Cloudsoft were recently tasked with presenting a compelling demo of their application management platform for the group CTO of a Tier-1 investment bank, showcasing their ability to roll out a global service across multiple regions with in-region elasticity to boot. We were thrilled when they contacted us and told us that they’d selected OpenGamma as an ideal candidate to demonstrate the power of their own technology. (The whole process has been carefully documented over on the Cloudsoft blog.)
Essentially, Cloudsoft/Brooklyn acts as a management layer, or broker, that provides OpenGamma resources to different users/apps and reacts to various events - for example, deploying extra services in case of a machine failure, or automatically transferring users to another region should the cloud service for a particular region fail. Clients have a range of deployment and configuration options depending on their particular requirements and constraints.
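As a rough illustration of that broker pattern (purely a sketch with hypothetical names, not the Cloudsoft AMP or Brooklyn API), the reaction logic amounts to something like the following:

```java
import java.util.List;

/** Hypothetical view of one regional deployment managed by the broker. */
interface RegionDeployment {
    String regionName();
    boolean isHealthy();
    void launchReplacementNode();          // scale out / replace within the region
    void redirectUsersTo(String region);   // fail the region's users over elsewhere
}

/** Sketch of a management layer reacting to health events. */
final class SimpleBroker {
    private final List<RegionDeployment> regions;

    SimpleBroker(List<RegionDeployment> regions) {
        this.regions = regions;
    }

    void onFailure(RegionDeployment affected, boolean wholeRegionDown) {
        if (!wholeRegionDown) {
            // A single machine failed: deploy an extra service in its place.
            affected.launchReplacementNode();
            return;
        }
        // The whole region's cloud service failed: move users to a healthy region.
        regions.stream()
               .filter(RegionDeployment::isHealthy)
               .findFirst()
               .ifPresent(healthy -> affected.redirectUsersTo(healthy.regionName()));
    }
}
```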
What Cloudsoft have built makes it easy for financial services firms to deploy OpenGamma by marrying our technology with the underlying infrastructure in a seamless way. The real power lies in Cloudsoft’s ability to offer burst capacity to public/private cloud as and when needed and the ability to use a mix of cloud and non-cloud resources seamlessly with minimal overhead for the client.
The project has certainly made us realise that Cloudsoft has removed several hurdles on the way to providing a full OpenGamma-as-a-Service solution. Whilst this is at the moment a Proof-of-Concept and we’ve still got a few things to figure out, we are looking into providing this as a commercial service at some point in the future.
Whilst it is early days, we are keen to talk to organisations that are going through cloud-related initiatives and reviewing their analytics capabilities. If you'd like to discuss this in more detail with Cloudsoft or OpenGamma, get in touch.
Posted over 11 years ago
We’ve just announced the release of the 2.0 version of the OpenGamma Platform over on our Developers site.
It took us a year to bless 1.0 after the developer previews, and we knew this one was just as big a change - hence the jump straight to 2.0 rather than another 1.x. We've added a lot of functionality - including advanced stress testing capabilities - to address the challenges that our users face on a daily basis due to market volatility, increasing amounts of data, and mounting regulatory pressures. Much of this was there all along, but it took too much work, at too low a level, to get it out of the system: we've made that extremely easy compared with 1.x!
There has also been an enormous amount of change "under the hood" to get customers into production on the new version - a big thank you to those clients who've helped us fine-tune all the new features and enhancements and provided constant feedback on how we can improve the system.
It's been a while since 1.2, and while customers have had help from us in working with "the new bits", we don't want to go that long without a community release in the future. We've implemented some changes to our internal development process to ensure that doesn't happen again - keep an eye on this blog for updates. If nothing else, the move to Maven as our build and artifact system will help us push interim releases out to the Open Source community as easily as we do to our commercial customers!
So what’s coming up next?
Our project roadmap is of course visible on JIRA, but in a nutshell, expect to see more of the same: more instruments, faster performance, more flexibility in getting the calculations that you really want out of the Platform. And a real and committed focus on making the Platform easier to program against. We know it can be a challenge to work out just how to make such a powerful and flexible system do exactly what you want. We're listening, and we're making it simpler.
In the past few months we’ve been actively talking to various business partners to strengthen our ecosystem and to ensure that our users have access to local development and support resources if they need help with the Platform. We’ll continue to make it easier for these partners to contribute code back to the core project and to provide you with value-added tools around our technology.
Oh, and we've also got something slightly different in the works - we're pretty excited about it, but just need to do a few more finishing touches before we can tell the world just what it is. Watch this space!
Posted almost 12 years ago
We often confront situations where only a limited amount of data is accessible and it is necessary to estimate values between two consecutive data points. In finance, as only a finite set of securities is traded in financial markets, it is a key task to construct a sensible curve or surface from discrete observable quantities such that the resulting curve or surface has desirable properties. For example, the absence of arbitrage in yield curve construction requires positivity of forward rates, whereas the derivatives of option values are constrained for an arbitrage-free implied volatility surface. The methodology of constructing new points between known data points is known as interpolation.
Recently I've been implementing a particular class of interpolation methods and decided that it might be a good idea to write a note summarising all the details of the interpolation algorithms and the properties of their resulting curves. As usual, the note is available as a free download on our website. I hope that the note will be useful for those who are interested in the empirical approach to curve construction in finance using the class of interpolation methods I worked on.
In the note I've summarised interpolation algorithms based on piecewise polynomial functions, which are frequently used to estimate a structure hidden behind a set of financial market data. In piecewise polynomial interpolation, the interpolant consists of polynomial functions associated with the individual intervals of the given data, with a continuity condition imposed at every data point. Let me briefly describe the background and our motivation for exploring the mathematics of interpolation.
Piecewise Linear Interpolation and Cubic Spline Interpolation
One of the simplest interpolation methods is piecewise linear interpolation, where the interpolant consists of straight lines connecting consecutive data points (Figure 1). In the figure, the dots represent a sample data set. Although the resulting curve preserves the shape of the given data, in the sense that the interpolation function is monotone in all of the intervals, it has a discontinuous first derivative, and this lack of differentiability is problematic when calibrating financial models.
Figure 1. Piecewise Linear Interpolation.
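Concretely, for data points \((x_i, y_i)\) with \(x_i < x_{i+1}\), the piecewise linear interpolant on the interval \([x_i, x_{i+1}]\) is

\[ f(x) = y_i + \frac{y_{i+1} - y_i}{x_{i+1} - x_i}\,(x - x_i), \]

which reproduces the data and is monotone on each interval, but whose first derivative jumps at every interior data point.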
Another popular method is cubic spline interpolation, where the data points are interpolated by piecewise cubic polynomials whose second derivative is continuous everywhere. Figure 2 shows the resulting curves of the cubic spline interpolation on the same data set as the linear case, where the three lines correspond to different choices of endpoint conditions. In many cases the curves suffer from unnatural “wiggles and bumps,” and the shape of the data is not preserved, which is known as the overshooting problem. Moreover, changing an input can alter the shape of the piecewise polynomial functions far away from the changed data point, meaning that the interpolation is nonlocal, which in turn can suggest inappropriate hedging ratios.
Figure 2. Cubic Spline Interpolation.
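In the same notation, the cubic spline restricted to \([x_i, x_{i+1}]\) is a cubic polynomial \(s_i\) satisfying

\[ s_i(x_i) = y_i, \quad s_i(x_{i+1}) = y_{i+1}, \quad s_{i-1}'(x_i) = s_i'(x_i), \quad s_{i-1}''(x_i) = s_i''(x_i) \]

at every interior data point. These conditions leave two degrees of freedom, which are fixed by the endpoint conditions (for example, the "natural" choice \(s''(x_0) = s''(x_n) = 0\)); this is why Figure 2 shows one curve per endpoint condition.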
The Quest for Shape-Preserving Interpolation Algorithms
Given the imperfection of the two interpolation methods above, one might ask:
Can we obtain a piecewise polynomial interpolation which is as shape-preserving as the piecewise linear interpolation, and at the same time, as smooth as the cubic spline interpolation?
For many years this question has been addressed by applied mathematicians, computer scientists and others in quantitative research, looking for a powerful but simple technique of constructing, roughly speaking, a “good-looking” smooth curve.
After introducing the basic interpolations mentioned above, the note moves on to discuss shape-preserving interpolation algorithms. Due to a number of desirable properties, Hermite polynomials play an important role: the cubic interpolant is determined once first derivative values are given at the data points, while the quintic Hermite interpolant additionally requires second derivative values at the data points. Thus the interpolation problem based on Hermite polynomials is to determine the derivative values such that the resulting curve is local and shape-preserving. An alternative algorithm is monotone convex interpolation, which was originally designed for yield curve construction but is probably of use for other purposes as well.
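For reference, on \([x_i, x_{i+1}]\) with \(h_i = x_{i+1} - x_i\) and \(t = (x - x_i)/h_i\), the cubic Hermite interpolant matching the values \(y_i, y_{i+1}\) and the first derivatives \(m_i, m_{i+1}\) is

\[ p(x) = (2t^3 - 3t^2 + 1)\,y_i + (t^3 - 2t^2 + t)\,h_i m_i + (-2t^3 + 3t^2)\,y_{i+1} + (t^3 - t^2)\,h_i m_{i+1}, \]

so a shape-preserving scheme reduces to choosing the \(m_i\) (and, in the quintic case, the second derivatives as well) so that monotonicity and locality are retained.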
I also comment on two-dimensional spline interpolation and other useful interpolation methods that we frequently come across in finance.
Let me finish this post with a couple of remarks. As pointed out in the past literature, there is no perfect interpolation algorithm that is always the best method for every kind of problem. This means that we need to understand the interpolation algorithms and choose an appropriate one in accordance with the given data and objectives.
The interpolations discussed in the note are only a small subset of the piecewise polynomial interpolations, which are themselves only one family among all interpolation methodologies. For further exploration, see e.g. de Boor's A Practical Guide to Splines.
Download ‘Piecewise Polynomial Interpolations’ (PDF)
Posted almost 12 years ago
The fifth annual R/Finance conference will be held in Chicago this week, 17-18 May. Aimed at users of R, the open source programming language for statistical computation and graphics, the event focuses on using R as a primary tool for financial risk management, analysis and trading. Topics covered include portfolio management, time series analysis, advanced risk tools, high-performance computing, econometrics, and more.
R Integration was included in the 1.0 release of the OpenGamma Platform last year. Since then, we’ve seen many of you using the integration module to perform complex portfolio simulations, statistical analysis and visualizations.
As part of R/Finance, we'll be showcasing the ability to shock curves based on user-defined inputs, as well as the ability to apply the difference between two days to any existing market data snapshot. These techniques can be applied to pre-existing portfolios or to portfolios entered programmatically from R.
OpenGamma working with R
By modifying several of the OpenGamma R examples, an analyst can create custom stresses and scenarios to perform statistical perturbations on market data and security terms, all fully integrated with the rest of the OpenGamma Platform. Even better, these calculations all happen on the same server-side infrastructure used by the rest of your installation: keeping your workstation free for the work that has to be done on it. We’ll be showcasing the R integration at our stand throughout the event.
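To give a flavour of what such a perturbation does (a minimal sketch using plain Java collections, with a hypothetical snapshot layout rather than the actual OpenGamma or R API), shocking a set of curve quotes or applying a day-on-day difference is just a transformation of the market data map:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Minimal sketch of shocking a market data snapshot of curve quotes. */
public final class CurveShockExample {

    /** Applies an additive parallel shift to every quote in the snapshot. */
    static Map<String, Double> parallelShift(Map<String, Double> snapshot, double shift) {
        Map<String, Double> shocked = new LinkedHashMap<>();
        snapshot.forEach((tenor, quote) -> shocked.put(tenor, quote + shift));
        return shocked;
    }

    /** Applies the day-on-day difference between two snapshots to a base snapshot. */
    static Map<String, Double> applyDifference(Map<String, Double> base,
                                               Map<String, Double> dayOne,
                                               Map<String, Double> dayTwo) {
        Map<String, Double> shocked = new LinkedHashMap<>();
        base.forEach((tenor, quote) ->
            shocked.put(tenor, quote + (dayTwo.get(tenor) - dayOne.get(tenor))));
        return shocked;
    }

    public static void main(String[] args) {
        Map<String, Double> quotes = new LinkedHashMap<>();
        quotes.put("2Y", 0.0125);
        quotes.put("5Y", 0.0180);
        quotes.put("10Y", 0.0240);
        System.out.println(parallelShift(quotes, 0.0050)); // +50bp parallel shock
    }
}
```

In the actual integration the same idea is expressed through the OpenGamma R examples, with the perturbed snapshot handed back to the engine so the revaluation runs on the server-side infrastructure described above.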
If you are unable to make it to R/Finance, or if you’ll be in Chicago a few days early, join us for a pint (or two) at the OpenGamma OpenPub on Wednesday May 15th at Poag Mahone’s. If you work in finance, technology, or financial technology, we’d love to meet you. Come along to meet the team and ask any questions you may have!
Register for the OpenPub
Posted almost 12 years ago
Last week, OpenGamma attended the MATLAB for Finance seminar in London. We gave a short presentation on calling MATLAB functions from OpenGamma via the MATLAB Production Server (MPS). This blog post is a summary of the presentation.
Calling MATLAB functions from OpenGamma
Many of our clients and prospects have extensive collections of pricing models written in MATLAB. Our goal is to help you transition these models from research into production on the OpenGamma Platform, allowing you to retain your investment in, and the flexibility of, MATLAB within a real-time enterprise risk platform.
The process is simple and consists of just a few steps.
Step 1: Create the MATLAB function.
This is where the advanced proprietary model code would be written. In this example, we are just using simple Black-Scholes pricing.
Step 2: Create the MPS Server.
Step 3: Compile into a generic CTF.
Step 4: Declare the Java interface.
This interface needs to have a function with the exact same name as the MATLAB function that we want to call, and the arguments need to be of the type that MATLAB expects.
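As an illustration (a sketch only; the interface name, method signature and types below are hypothetical, since the interface simply mirrors whatever the deployed MATLAB function declares), such an interface might look like this:

```java
/**
 * Java mirror of the deployed MATLAB function. The method name must match
 * the MATLAB function name exactly, and the argument and return types must
 * be ones MATLAB understands (here, plain double scalars).
 */
public interface BlackScholesMps {

    // Hypothetical MATLAB counterpart:
    //   function price = blackScholesPrice(spot, strike, rate, vol, expiry)
    double blackScholesPrice(double spot, double strike, double rate,
                             double vol, double expiry);
}
```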
Step 5: Create the metadata to express input requirements.
This is what the OpenGamma engine will use to construct the dependency graph.
Step 6: Register the function.
We register the function in the function repository to make it available in the OpenGamma Platform.
Step 7: Configure a view.
This is where we configure what risk metrics we want to see. In this example, we will be showing the MATLAB-based method next to the OpenGamma native model.
Step 8: Enjoy!
In the risk viewer, we select the view that we just created and numbers start to come up in real time. We can use the powerful dependency graph capabilities to check that the function being called is in fact the one we wrote that uses MPS.
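Behind the scenes, the call from Java amounts to binding that interface to the deployed archive through the MATLAB Production Server client and invoking it like any other method. A minimal sketch follows; the URL, archive name and interface are hypothetical, and the client class names are quoted from the MPS Java client library as we used it, so check them against your MPS release:

```java
import java.net.URL;

import com.mathworks.mps.client.MWClient;
import com.mathworks.mps.client.MWHttpClient;

public final class MpsCallExample {
    public static void main(String[] args) throws Exception {
        MWClient client = new MWHttpClient();
        try {
            // Bind the Java interface to the CTF archive deployed on the MPS instance.
            BlackScholesMps pricer = client.createProxy(
                new URL("http://mps-host:9910/blackScholesArchive"),
                BlackScholesMps.class);
            double price = pricer.blackScholesPrice(100.0, 105.0, 0.02, 0.25, 1.0);
            System.out.println("MATLAB-priced option: " + price);
        } finally {
            client.close();
        }
    }
}
```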
Posted almost 12 years ago
The annual FTF News Technology Innovation Awards recognise organisations and individuals that embrace the rapid changes in technology to achieve business excellence in financial services. 2013 was the first year that OpenGamma applied for the awards.
We are particularly honoured to have been nominated for Best Cutting-Edge Solution. It perfectly describes our approach to the market: cutting-edge technology combined with a cutting-edge business model. We promise our customers a legacy-free approach to their market risk and trading analytics architecture.
We were also nominated for the Best Fintech Ops Start-Up award, which recognizes a financial technology start-up - founded within the past five years - that has devised innovative technology that enables end-user firms to differentiate themselves in the market.
Following our nominations for the HFM and FStech Awards, and recognition from Fast Company as one of the most innovative companies in the world, I'm pleased that so many top industry insiders are recognizing just how much demand there is for an Open Source alternative in the quantitative finance space!
The winners are determined by votes cast by industry participants (with a valid business email address). Voting is open until April 30. If you think we deserve to win, please vote for us by clicking the link below - it’ll only take a few minutes!
Cast your vote