
News

Posted almost 12 years ago
Over 200 attendees, 37 projects, and more lines of code than one dares to count - it seems safe to say that the first NYC FinTech Hackathon was a roaring success. The OpenGamma team flew over from London to help attendees working with our APIs. We were extremely impressed by the attendees' enthusiasm to produce something in a very short space of time, often combining many of the technologies on offer. There was a wide mix of participants with a range of finance and non-finance backgrounds; some came as a team, while others formed teams on the first day or worked by themselves.

The winner of the best use of OpenGamma was Arcadius Kazimierski, who produced a slick multi-touch Windows 8 application for reconciliation. He gave a captivating presentation to the packed room, attracting interest from the judges. He showed that modern technologies can improve the tools available to traders, choosing OpenGamma's RESTful API to source time-series data and display it in his .NET application.

A technically impressive extension to OpenGamma came from another project, which performed calculations in the cloud. Because the Platform is open source, the team was able to modify its source code to create a version that dynamically fired up calculation nodes on Amazon EC2 instances, allowing huge scalability without the physical hardware.

We hope that those of you who didn't get time to incorporate OpenGamma in your hacks will download and explore our technology in your own time. Seeing our APIs used in action also made us realise there are a few ways we could make it quicker and simpler to integrate OpenGamma into other projects. Watch this space for announcements.
Posted about 12 years ago
OpenGamma are participating in the inaugural FinTech Hackathon in New York City on 6th to 7th April. The goal of the hackathon is to promote the development of financial technology for institutional customers and consumers using open source software. Attendees will have the opportunity to utilise new financial APIs (including ours) and experiment with a number of platforms and systems. Other companies involved include 10gen, OpenFin, StockTwits, Bloomberg, The Fintech Innovation Lab, and more.

During the hackathon, we'll be providing attendees with access to real market data and a preconfigured environment containing:

- security reference data in our security master
- historical time series (day duration)
- holiday calendars
- region/currency info
- curve and surface definitions

as well as access to our Microsoft Excel plug-in and the R client. There is a $10,000 prize up for grabs for the overall winner; additionally, we'll be awarding a Chromebook Pixel for the best app built with OpenGamma's technology!

What can I hack?

We look forward to hearing your ideas, but here are a few just to get you started:

- Interfacing data from external sources, such as Yahoo! Finance or Google Finance
- Using different technologies for storing time series or tick data
- Using different technologies for the shared value caching in our engine (we currently use EHCache, but you could use Cassandra, Memcached, Riak, etc.)
- Alternative JavaScript graphing/charting libraries (candles, band overlays, etc.)
- Cassandra/Voldemort/Hadoop storage of individual result points in a stream
- Web-based market data snapshot editing and visualisation
- Viewing data sets stored in our batch database in the web UI

Interested? Register here for a full 24 hours of FinTech hacking!
Posted about 12 years ago
Over the past few months I've been working on solving the type of one-dimensional PDE problems that occur in finance. The outcome of this work is a reference implementation using finite-difference methods written in MATLAB. In this code (which can be downloaded along with the accompanying paper below) the following problems are addressed:

- European call and put options
- barrier options
- American call and put options

all of which can be priced with a local volatility surface and term structures of rates and yields. The first thing to say is that I do not have a background in PDE solving or applied maths per se, but I do have a PhD in particle physics and am comfortable around equations. The second thing is that, until recently, I'd not programmed in MATLAB.

Existing literature on the finite-difference method

Almost every textbook on "how to be a quant" has a chapter on the finite-difference method, concluding with Crank-Nicolson as the last word (even though it dates from 1947). The more advanced and specialist books do go deeper into the subject, but often delve into excruciating mathematical detail which detracts from the main message. At the same time, the texts often gloss over the applied-maths witchcraft and rarely provide anything but the most trivial of code implementations. It was after wading through this literature that I wrote the first OpenGamma PDE solver in Java. This worked and produced accurate prices in an acceptable time, but I lacked the tools (both intellectual and computational) to really analyse what was going on in the system. Some refactoring to clean up the code aside, the system remained unchanged.

PDE solving in MATLAB

It was then suggested by our (then) new hire, Stuart Archibald, that I start by trying to use spectral methods to solve PDEs. To this end he gave me Spectral Methods in MATLAB by Lloyd N. Trefethen. I very much like the style of this book.
It is fundamentally practical: it discusses a concept, then gives working MATLAB code that generates the figures demonstrating that concept. In just 152 pages I learnt about spectral methods and how to code MATLAB - a fantastic two-for-one offer. Especially when working with matrices, MATLAB is much more terse than Java (sometimes you need ten times as much code in Java to do the same thing). This, together with the built-in functionality, made me think it would be a good idea to template models in MATLAB before writing a Java implementation. For the PDE solver this was a little backwards, as I already had a Java implementation; however, I reasoned that rewriting it in MATLAB from the ground up would give greater scope for experimentation.

Whilst writing the experimental code it became apparent that many numerical problems boil down to (repeatedly) solving matrix systems. In the case of these PDE solving methods, all one needs to do is put the right numbers in the right place in the matrix - something that is glossed over as if it were trivial, but that will be the source of nearly all your bugs (getting suitable, consistent boundary conditions being the most important!). If you are lucky the system will blow up and you'll know you have a bug, but often you'll get an answer that is essentially correct yet not as accurate as it should be, or a solution path that is inefficient. I made these bugs and found them. I also found the tweaks that make the system faster and more accurate.

A mini-guide to the basics of solving PDEs occurring in finance

In documenting all this, I decided it might be a good idea to write a paper, in the style of Trefethen, with all the details, that used the MATLAB code to generate all its figures so it was completely transparent.
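To make the "right numbers in the right place" point concrete, here is a minimal Crank-Nicolson solver for the Black-Scholes PDE, sketched in Python/NumPy rather than MATLAB. This is a toy illustration, not the OpenGamma solver or the code from the paper: the function name is mine, dense matrices are used for clarity, and the Dirichlet boundary handling is exactly the fiddly bookkeeping described above.

```python
import numpy as np

def bs_call_cn(S0, K, r, sigma, T, M=200, N=200):
    """Price a European call by Crank-Nicolson on the Black-Scholes PDE.

    M: number of asset-price steps; N: number of time steps.
    Dense matrices are used for readability; a real implementation
    would exploit the tridiagonal structure.
    """
    S_max = 4.0 * K
    dS = S_max / M
    dt = T / N
    S = np.linspace(0.0, S_max, M + 1)
    V = np.maximum(S - K, 0.0)              # terminal payoff

    i = np.arange(1, M)                     # interior node indices (S = i*dS)
    # Central-difference coefficients of a*V[i-1] + b*V[i] + c*V[i+1]
    # for the operator 0.5*sigma^2*S^2*V_SS + r*S*V_S - r*V.
    a = 0.5 * (sigma**2 * i**2 - r * i)
    b = -(sigma**2 * i**2 + r)
    c = 0.5 * (sigma**2 * i**2 + r * i)

    A = np.zeros((M - 1, M - 1))
    A[np.arange(M - 1), np.arange(M - 1)] = b          # diagonal
    A[np.arange(1, M - 1), np.arange(M - 2)] = a[1:]   # sub-diagonal
    A[np.arange(M - 2), np.arange(1, M - 1)] = c[:-1]  # super-diagonal
    L = np.eye(M - 1) - 0.5 * dt * A        # implicit side
    R = np.eye(M - 1) + 0.5 * dt * A        # explicit side

    for n in range(N):                      # march backwards from maturity
        tau_old, tau_new = n * dt, (n + 1) * dt
        rhs = R @ V[1:M]
        # Boundary conditions: V(0)=0 and V(S_max)=S_max - K*exp(-r*tau).
        # The V[M] term falls outside the matrix, so it moves to the RHS;
        # getting this wrong is the classic silent-accuracy-loss bug.
        hi_old = S_max - K * np.exp(-r * tau_old)
        hi_new = S_max - K * np.exp(-r * tau_new)
        rhs[-1] += 0.5 * dt * c[-1] * (hi_old + hi_new)
        V[1:M] = np.linalg.solve(L, rhs)
        V[0], V[M] = 0.0, hi_new

    return float(np.interp(S0, S, V))
```

With at-the-money inputs (S0 = K = 100, r = 0.05, sigma = 0.2, T = 1) the result agrees with the Black-Scholes closed form to a couple of decimal places, and refining the grid improves it further, which is exactly the kind of check the paper advocates making routinely.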
As usual, both the paper and the MATLAB code are available as a free download on our website:

Download ‘Numerical Solutions to PDEs with Financial Applications’ (.pdf)
Download the MATLAB code (.zip)

I hope both the paper and the code will prove useful for people going through the same process I went through. I cannot guarantee that either is bug free, but we welcome (constructive) feedback and bug reports, and will endeavour to merge changes into the paper and code.

Final thought

You may well be wondering how a Java shop is going to use MATLAB implementations of numerical methods in its code. Clearly we could translate the code into Java by hand; however, this would be expensive in terms of time and not particularly flexible (say, if we need to fix a bug or update/alter the methods). With this in mind, quite some time ago Stuart began a skunkworks project on a Java-based DSL called DOGMA that allows rapid translation of MATLAB-like code, including its terseness, to Java, with a whole load of go-fast magic built in... watch this space for more!
Posted about 12 years ago
While the industry hasn't completely recovered from 2008 (and I don't think it ever fully will), we're starting to see a new reality settling in, where risk management is at the forefront of all decision making at buy-side firms. Between risk managers, traders and quants, the analytics used at the heart of business decisions will be significantly impacted by regulatory demands (even for buy-side firms, some of whom thought themselves immune even a few years ago by virtue of their hitherto unregulated status), market structure changes, technology developments, and the continuing cost pressures of the new low-margin, low-interest-rate regime. I believe that plays out in several market changes and requirements that will impact analytics and risk management in the next 12 months.

A new level of transparency

One of the trends I'm seeing the most in interacting with industry professionals is a quest for transparency. Black-box numbers just aren't good enough anymore. I believe that transparency across the entire risk technology stack will drive a significant amount of internal investment and activity: the data that's used for calculations, the methodology employed, the configuration, right down to the code driving the calculations. Firing off a portfolio to a black-box API and getting a PDF report back just isn't good enough anymore for skilled buy-side institutions.

Industry equality

Until recently, if you wanted the level of technology available to the massive Tier-1 banks or $20Bn+ AUM hedge funds, you had to be willing (and able) to spend millions on a bespoke infrastructure. Whether it's cloud hosting (saving on data centre costs), open source infrastructure (saving on software licensing), or projects like OpenGamma, even a startup hedge fund now has access to the same standard of technology as the entrenched industry players. I believe that will lead in 2013 (and beyond) to a reduction in startup costs for new entrants, and a leveling of the playing field for everybody else. In fact, in some cases I think newcomers (or early adopters) will have significant advantages by virtue of being "legacy free".

Continued cost pressure

How do you achieve alpha when worldwide interest rates are hovering near zero? More importantly, if your firm is paid for performance (as most buy-side firms are), how can you continue to fund operations when a good return might be several percent per annum below what it was in 2007? Simply put, the only way to achieve the types of return on equity that firms now need is to look at their cost line rather than their trading PnL. You can be incremental or transformative in cost reduction, but this cost pressure is impacting risk technology as much as every other area of the business.

From batch reports to real-time risk

When we got started with OpenGamma, many risk managers and traders still weren't sure whether they needed real-time risk capabilities. Part of that, I believe, was that achieving that goal was so expensive (requiring custom build-outs) that very few of them had experienced how transformative a real-time, live view of their exposures and risk could be. But continued market unpredictability is driving every firm we talk to to see just how they can move from a static, overnight, PDF-based approach to risk to a real-time, dynamic, interactive view. I believe that trend will accelerate over 2013.

Conclusion

The financial services landscape is becoming more complex and unpredictable due to shrinking IT budgets, a continued call to do more with less to manage unpredictable data volumes, and shifting regulatory mandates. All this calls for a new standard for risk analytics, and firms must embrace the most cost-effective, flexible and powerful analytics solutions available. Firms forced to contend with market-wide calls for greater accountability, real-time transparency and improved risk management practices will quickly realize where their dollars are best spent - and the unique value that open source solutions can bring them - leading to a new era of open source in capital markets.

What are the key issues you think you'll need to face this year? What changes will you be making in your risk management and analytics architecture in 2013? Share in the comments below.
Posted about 12 years ago
This is the first post in a two-part series: in this post, we'll show you how to execute from Java into MATLAB. In the next post, we'll bring it into the full OpenGamma environment.

At OpenGamma we regularly use MATLAB for prototyping some of our models internally. However, we also want to give our users the ability to interact with the OpenGamma Platform directly from MATLAB. I've been a happy user of MATLAB for almost 14 years now; in my previous job, I developed a front-office sales infrastructure for OTC derivatives in MATLAB that proved to be quite successful. I was therefore keen to repeat that success here at OpenGamma.

Integrating MATLAB with the OpenGamma Platform

We've recently become an approved MATLAB partner (profile soon to go live on the MathWorks website). This allows us to work closely with the MathWorks team to ensure that integration between MATLAB and the OpenGamma Platform is as seamless as possible. We are currently working on integrating the OpenGamma Platform with MATLAB on three levels:

1. Market Data. Functionality for MATLAB users to query the OpenGamma market data server components for historical and real-time data, as well as static data (such as bond conventions). This is provided by OpenGamma and is not part of the MathWorks Datafeed Toolbox.

2. MATLAB Client. Identical in nature to our R Integration Module, but using all the MATLAB language features to make it feel native, this will enable users to interact with the OpenGamma Platform from MATLAB, putting the full power of the OpenGamma server at their disposal: pre-trade analysis, generating scenarios in MATLAB and sending them over to OpenGamma for portfolio simulations, or injecting market data that has been filtered or calibrated in MATLAB into the OpenGamma server.

3. MATLAB analytics plugged in to the OpenGamma server. The OpenGamma Platform was built from the ground up to be able to use external models. As MATLAB is one of the leading platforms for prototyping quant models, we recognise that users must be able to plug in those models directly without having to re-code them in another language. Our vision is that, using the MATLAB Production Server in conjunction with the OpenGamma Platform, clients will be able to use their existing MATLAB assets seamlessly.

In this blog post I will be talking about the last part: using "in-house" models written in MATLAB directly from OpenGamma.

MATLAB Production Server

The MATLAB Production Server is a relative newcomer to the MATLAB family, but it builds upon the MATLAB Compiler products and MCR technology that have been well tested for several years. Having just installed it on my machine, I was able to put together an example pricer for vanilla options in a matter of minutes. The MATLAB function that does the math uses the Financial Toolbox blsprice function, but this is where you would put in your own model. Then, on the Java side, we create an interface that describes the function. This is extremely convenient because it allows for automatic data conversion of inputs and outputs (Java -> MATLAB -> Java), and therefore avoids having to deal with MWArray objects, which used to be the case in the MATLAB JA Compiler product. Finally, a simple Java program establishes a connection and prices an option.

In my next post, I'll show you how to integrate MATLAB into OpenGamma.
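The maths behind blsprice is just the Black-Scholes formula. As a rough, hedged illustration (sketched here in Python rather than MATLAB, with an illustrative function name of my own, not part of either product's API), the same call and put prices can be computed like this:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bls_price(price, strike, rate, time, volatility):
    """Black-Scholes call and put prices, mirroring the
    (Price, Strike, Rate, Time, Volatility) argument order of blsprice.
    Returns (call, put)."""
    d1 = (log(price / strike) + (rate + 0.5 * volatility ** 2) * time) / (
        volatility * sqrt(time))
    d2 = d1 - volatility * sqrt(time)
    call = price * norm_cdf(d1) - strike * exp(-rate * time) * norm_cdf(d2)
    put = call - price + strike * exp(-rate * time)  # put-call parity
    return call, put
```

For example, bls_price(100, 95, 0.1, 0.25, 0.5) gives roughly 13.70 for the call and 6.35 for the put. In the Production Server setup described above, a function like this would live in MATLAB and be invoked from Java through the generated interface.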
Posted about 12 years ago
Last week OpenGamma attended the GAIM USA conference in Boca Raton, Florida. Labeled as “the kick-off hedge fund event of the year”, GAIM brought together managers and investors to exchange ideas and debate current issues in the industry. Although the event was slightly smaller than in previous years, it attracted over 400 attendees representing close to 200 companies.

The dominant theme at the event was a realization that the “golden age of alpha” is over for hedge funds. With around 8,000 hedge funds competing in a crowded field, we are seeing both fees and performance coming down. Add to this a myriad of new regulations and negative returns in two of the last four years, and it is no wonder that hedge fund managers are fervently debating exactly where strong returns might come from after a disappointing 2012.

Making sense of regulation

One of the most interesting sessions I attended was a debate on regulation moderated by Charles Millard (Citigroup). David Kotz (former SEC Inspector General) pointed out the regulators' apparent lack of interest in the opinions of people in the field who really understand the markets. Kotz had been very involved with the Dodd-Frank rewrite, and according to him, the people ‘on the hill’ would routinely disregard his team's recommendations. It quickly became apparent that unless his office agreed with every piece of legislation handed down from Congress, it would be ignored.

Millard then steered the conversation to Dodd-Frank and whether the industry was better off because of it. Pippa Malmgren (Principalis Asset Management) raised the philosophical question of whether government can protect people from a loss in the first place. According to her, the perception is that, due to the financial crisis, citizens will have to work longer than they expected for a lower standard of living than they anticipated - and consequently they want to hold someone accountable for this. Malmgren explained that until a ‘bad guy’ is publicly pulled off the stage, the public's desire to see the swamp cleared of sharks will not be satisfied. We will therefore continue to see a more aggressive regulatory approach.

The panel concluded that the SEC does not seem to have a clear understanding of its role. It doesn't want to get into spats that it will lose, which would undermine its authority; it will therefore focus on battles it knows it can win. In terms of impact on hedge funds, Kotz pointed out that the overall cost of all the disclosures is considerable, and that they have little value. In many cases, documents are collected and just filed away. The SEC doesn't have the resources to do much with them, so in many ways it can be seen as a waste of money. However, Paul Atkins (former SEC Commissioner) reminded the audience that because many people within the SEC do not understand what they are examining, it's crucial that you ensure your papers are in order.

Do emerging managers have an advantage?

During the Hedge Funders of the Future: The Next Generation session, a group of hedge fund CIOs discussed whether emerging managers have an advantage over the larger, more established players. The panel agreed that there is a hunger and a passion in the smaller managers to be contrarian, and to do what they need to do to perform. They also tend to be more nimble, and are able to take advantage of opportunities that a large organization might not be able to exploit. One panel member described starting his journey at large bureaucratic institutions, adding that he now enjoys the smaller framework and the ability to make quick decisions. Larger peers take longer to organize their resources and go through checks and balances. A smaller framework allows you to be relevant in more opportunities. It also means you don't have to deploy large amounts of capital for a particular strategy to be effective. Smaller managers can be more focused and concentrated: it is much harder to move the needle when you are in the $1-10 billion bracket.

But there are exceptions. For example, BlackRock was described by one panel member as having a very strong entrepreneurial spirit despite being a large organization. Each business unit makes its own decisions and has its own responsibilities, yet has access to enormous firmwide resources. The company has also architected robust risk systems and technology, and can tap into a team of central bankers and some of the best economists for additional insights. Nevertheless, small and mid-size funds will continue to face more regulatory pressure. Form PF and other new regulations will force these funds to spend valuable time and money on compliance, with little or no value to the funds, their investors or the ultimate taxpayers.

Still cautious

The environment has improved over the last six months for the hedge fund community, but the overall atmosphere at the conference was still one of caution. With the rising costs of running or starting a hedge fund, managers are keen to keep costs down while coming under more and more scrutiny from regulators and investors. At OpenGamma, we are keeping a close eye on developments, and are working both on a hosted version of the OpenGamma Platform (to cater to smaller funds) and on ensuring that the Platform supports both current and future regulatory demands. We believe that a modern, open source platform for risk and trading analytics will benefit the buy-side as a whole, future-proofing risk architectures without a large upfront investment or a forklift replacement of monolithic legacy systems.
Posted about 12 years ago
Join us for the first OpenGamma OpenPub London of the year at The Golden Fleece in the City (8 Queen Street, London EC4N 1SP) on Thursday 31st January from 6 pm onwards. Our previous OpenPub was held in Mayfair, so we thought we'd move the party to the City for a change. The Golden Fleece is a fantastic pub with an Edwardian interior; you'll find us in the private function room.

We'll be sharing some of the exciting plans we have for 2013, and showcasing the new 1.2 release of the OpenGamma Platform. Join us for a pint (or two), meet the team and ask any questions you may have (or celebrate the end of your dry January, if you've been abstemious this month). As always, the drinks are on us! Simply register on Eventbrite so that we have an idea of who's coming, and how big an area to reserve. Hope to see you there!

For those of you in New York, have no fear: there's still a chance to register for tomorrow's OpenPub NYC, and thanks to my willingness to burn air miles, I'll be there in time for the festivities tomorrow (back at the Wheeltapper from 6 pm, British Airways willing).
Posted about 12 years ago
In my previous blog posts, I described the implementation of Algorithmic Differentiation (AD) in our OG-Analytics library. The last part of the post was devoted to a special technique that combines algorithmic differentiation and the implicit function theorem in the case of calibration through root finding. The technique further improves the efficiency obtained with AD. The paper describing the methodology will appear in a forthcoming issue of the Journal of Computational Finance; the preprint version is available on our website.

Since writing the blog posts and the above paper, I have developed the method further. It can also be applied in the case of calibration by least squares (minimisation of errors). The technique is very similar, except that it applies to the derivative of the function being minimised (which is zero at the minimum) instead of applying to the root-finding function directly. As the derivative of the minimised function is used, the method requires a second-order derivative. Fortunately, the second-order effect is so small that it can be neglected in practice. The efficiency gain with respect to the textbook implementation is even larger in the least-squares calibration case.

The theory behind the extension, and some examples of implementation for practical financial problems, are described in a new working paper now available on our website and on SSRN: Algorithmic Differentiation in Finance: Root Finding and Least Square Calibration (PDF).

I will be giving a presentation on the results at an Imperial College Finance and Stochastics Seminar on Wednesday 30 January 2013. The title of the presentation will be "Algorithmic Differentiation in Finance: Calibration and Implicit Function Theorem". The seminars are open to everyone and no registration is required; visit the Imperial College website for more information.
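The root-finding case can be illustrated with a toy example (entirely illustrative, not the OG-Analytics code): calibrate a parameter theta so that a residual f(theta, m) is zero for a market input m, then obtain the sensitivity from the implicit function theorem as dtheta/dm = -(df/dm)/(df/dtheta), evaluated once at the calibrated point, with no need to differentiate through the root-finder's iterations.

```python
def f(theta, m):
    """Toy calibration residual: we want the theta with f(theta, m) = 0."""
    return theta ** 3 + theta - m

def df_dtheta(theta, m):
    return 3.0 * theta ** 2 + 1.0

def df_dm(theta, m):
    return -1.0

def calibrate(m, theta0=0.0, tol=1e-12):
    """Newton root-finder for theta(m)."""
    theta = theta0
    for _ in range(100):
        step = f(theta, m) / df_dtheta(theta, m)
        theta -= step
        if abs(step) < tol:
            return theta
    raise RuntimeError("Newton iteration did not converge")

m = 10.0
theta = calibrate(m)  # theta**3 + theta = 10, so theta = 2

# Implicit function theorem: differentiate f(theta(m), m) = 0 once at the
# calibrated point, instead of through every Newton iteration.
sens_ift = -df_dm(theta, m) / df_dtheta(theta, m)

# Brute-force check: bump the market input and recalibrate.
eps = 1e-6
sens_bump = (calibrate(m + eps) - calibrate(m - eps)) / (2.0 * eps)
```

The two sensitivities agree to high precision, but the implicit-function route costs one derivative evaluation rather than two full recalibrations per input. In the least-squares extension the same trick is applied to the gradient of the objective, which is where the second-order term mentioned above enters.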
Posted about 12 years ago
A lot of you have been asking us about our business model - we've seen your questions on online forums, noticed your Google searches, and spoken with you at various events. In this blog post, I'll try to explain how OpenGamma as a company makes money, and what plans we have for the future.

Commercial Agreements

Right now we sell an annual support agreement that we call internally "OpenGamma Enterprise", which is most appropriate for sophisticated firms that want to run a customized on-site installation of the OpenGamma Platform. We charge a single, all-inclusive annual fee, which includes four things that customers have told us are of significant value:

1. Proprietary Components. While the vast majority of our intellectual property is open source and released under the Apache license, there are some components where we're legally prohibited from releasing our code under an open source license. These are primarily data adapters for certain data feeds (like Reuters and Activ) and trading systems (such as Enfusion Integrata), where the vendor doesn't allow us to release our code as open source. However, for almost all of these components, customers have access to the full source code on a perpetual royalty-free license.

2. A Commercial License. While techies are usually fine with the Apache license, procurement departments often aren't. They like a commercial license with things like representations and warranties about the software, IP indemnification, etc. - basically, someone to sue. Pay us, and we'll be that someone.

3. Confidential Support. We take pride in our high level of support for the open source community. If you're fine with your developers and users posting messages to forums, chatting on IRC, or asking questions on sites like Stack Overflow, we're more than happy to support you for free. But in return, we insist that all of this support has to be google-able to help the rest of the community. Many firms aren't comfortable with that. If you want your developers and end users to be able to pick up the phone and call us, send us an email, or have a Service Level Agreement (SLA) with us, then we need a commercial agreement in place.

4. Proactive Configuration and Maintenance Advice. The OpenGamma Platform is an extremely large, complex system that can do a massive number of things extremely well. But even with the best documentation, figuring out how to make it do just what you want can take valuable time and effort on your part. Many customers prefer to be able to get on the phone, say what they're trying to achieve (such as a particular metric calculated using a particular model with particular parameters), and have the process take 15 minutes rather than a couple of weeks.

Because our customers are free to switch to self-support, we have to constantly prove ourselves in our support model. We do things other vendors won't. For example, we have remote login to some of our customers' production environments, so that we can proactively monitor their installations and check log files. We maintain a virtualised environment that has the full production configuration, though not the data, of all our customers, so we can do things like pre-test upgrades. And for some of our customers, we even share source code that integrates their code with the Platform, so we can offer proactive advice, or even include their code in our nightly QA process to make sure that we don't break it. Given the cost of staff at banks and hedge funds, we've never had an unfavorable ROI story; in fact, quite the opposite.

Future plans

The model I've described above is fantastic for firms with on-site developers and sophisticated requirements, but it's more expensive and comprehensive than some firms require. In the future, we'll have additional revenue streams to bring the power of the OpenGamma Platform to everybody else, through things like hosted services, pre-packaged applications and OEM arrangements with other software and service firms. Expect to hear more on this in the New Year.
Posted over 12 years ago
In this blog series, we'll introduce you to various members of our team to give you a flavour of what happens behind the scenes at OpenGamma. Ken Wong has just joined OpenGamma and will be working as a Technical Services Manager in our London office.

What's your background?

My career experience has been quite varied. Before joining OpenGamma, I co-founded and ran an FSA-authorised investment advisory company focusing on clean tech. Prior to this, I spent seven years with JPMorgan in Exotics & Hybrids in London and Tokyo. I managed the Tokyo Hybrids trading desk: pricing, trading and risk-managing long-dated cross-asset derivatives (IR, FX, equity indices and some commodities). In my early days at JPMorgan I was a Hybrids bookrunner producing daily P&L/positions, giving me a solid understanding of front-to-back processes and systems integration. I started my career at Andersen, working with clients in financial markets.

Why did you decide to join OpenGamma?

I wanted to use my experience in finance, business and technology, and to work with smart and interesting people in a cutting-edge fintech company. OpenGamma is building something no-one has done before: an open-architecture, open-source platform for capital markets trading and risk analytics. Having used a variety of pricing and risk management systems on the trading desk, I have an end-user's appreciation of how innovative and powerful the OpenGamma Platform is. Following the latest round of Series C funding, it is an exciting time in OpenGamma's journey, and I have the opportunity, along with the whole team, to help make it a commercial success. Also, who can say no to a perpetually fully-stocked fridge of Diet Coke and Dr Pepper?

What's your role going to be like?

My role will be engaging clients, prospective customers and partners. The OpenGamma Platform has been designed and engineered to handle the requirements of anyone who uses risk analytics - buy-side, sell-side, clearing houses, exchanges - so there will be a lot of scope in my role to work with such a diverse customer base. I will be involved in business development, pitching and demonstrating the platform, and project-managing client engagements from initial requirements analysis through to deployment, integration and long-term support. On the whole, I think the role will be hard work, a lot of fun and very rewarding!