Even bad data can be put to good use for testing models. One of my recent papers demonstrates that if we know a little about how and why data are noisy, we can adjust test results accordingly and use the noisy data to improve decision-making about predictive models.
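As a toy illustration of the general idea (not the paper's actual method), suppose test labels are flipped independently with a known rate rho < 0.5. The accuracy we observe then relates to the true accuracy as observed = true*(1 - rho) + (1 - true)*rho, which can be inverted to recover an estimate of the true accuracy:

```python
def corrected_accuracy(observed_acc: float, noise_rate: float) -> float:
    """Invert the symmetric label-noise relation
        observed = true*(1 - rho) + (1 - true)*rho,
    giving true = (observed - rho) / (1 - 2*rho).
    Only meaningful when rho is known and strictly below 0.5."""
    if not 0.0 <= noise_rate < 0.5:
        raise ValueError("noise rate must be in [0, 0.5)")
    return (observed_acc - noise_rate) / (1.0 - 2.0 * noise_rate)

# A model scoring 0.80 against a test set whose labels are wrong 10%
# of the time is actually doing better than it appears:
print(corrected_accuracy(0.80, 0.10))  # 0.875
```

The point of the sketch: a raw score on noisy data understates (or overstates) model quality in a predictable way, so knowing *how* the data are noisy lets you compare models fairly anyway.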
Banks, insurance companies, and brokerages have all been investing heavily in consumer mobile apps for years now. The result: enhanced customer service; bolstered loyalty; innovative services (e.g., mobile remote deposit); improved services for sophisticated users, especially those in more-developed countries; and improved services for unbanked and underbanked households, especially those in less-developed countries. Why, then, are most financial services institutions hesitant to tout their return on these investments? More importantly, how can they bring a real ROI to their bottom line?
The financial services industry in the U.S. insists on both maintaining decades-old domestic standards and developing new ones -- from the Balance and Transaction Reporting Standard (BTRS) to the Fedwire extended remittance formats -- instead of moving to the ISO 20022 global standards. And it's having a tough time.
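For context, ISO 20022 payment messages are structured XML. A heavily abbreviated, purely illustrative skeleton of a customer credit transfer initiation (pain.001) looks roughly like this (identifiers and amounts are placeholders, and most mandatory elements are elided):

```xml
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pain.001.001.03">
  <CstmrCdtTrfInitn>
    <GrpHdr>
      <MsgId>MSG-0001</MsgId>
      <CreDtTm>2012-11-19T09:30:47Z</CreDtTm>
    </GrpHdr>
    <PmtInf>
      <CdtTrfTxInf>
        <Amt><InstdAmt Ccy="USD">1500.00</InstdAmt></Amt>
        <!-- debtor/creditor, agents, and account details elided -->
      </CdtTrfTxInf>
    </PmtInf>
  </CstmrCdtTrfInitn>
</Document>
```

The appeal of the global standard is exactly this kind of shared, machine-readable structure, versus parallel domestic formats that say the same things differently.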
On Wall Street, having access to large amounts of pricing data is a key building block for successful trading strategies. In-memory data models are able to collect, analyze and derive trends from the flood of information in the market. Companies then use these models to drive algorithmic programs that automatically conduct trades based on pre-defined criteria. These algos, in turn, can power any number of profit-generating activities, from finding arbitrage opportunities to speculating on emerging trends.
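A minimal sketch of that pattern -- an in-memory rolling window of prices feeding a pre-defined trading rule (the window sizes and the moving-average crossover rule here are invented for illustration, not any firm's strategy):

```python
from collections import deque

class CrossoverSignal:
    """Hold a rolling in-memory window of recent prices and emit a
    naive buy/sell/hold signal when the short-term average crosses
    the long-term average."""

    def __init__(self, short: int = 5, long: int = 20):
        self.short, self.long = short, long
        self.prices = deque(maxlen=long)  # in-memory price window

    def on_price(self, price: float) -> str:
        self.prices.append(price)
        if len(self.prices) < self.long:
            return "hold"  # not enough history yet
        window = list(self.prices)
        short_ma = sum(window[-self.short:]) / self.short
        long_ma = sum(window) / self.long
        if short_ma > long_ma:
            return "buy"   # recent prices trending above the baseline
        if short_ma < long_ma:
            return "sell"  # recent prices trending below the baseline
        return "hold"
```

In production, of course, the "window" is a far larger in-memory dataset and the rule is far more sophisticated, but the shape is the same: stream ticks in, derive a trend, act on a threshold.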
The unprecedented impact of Superstorm Sandy is forcing financial institutions to rethink assumptions about disaster recovery, contingency planning, and the vulnerability of on-premises solutions. As a result, we may see widespread adoption of cloud-based disaster-recovery solutions, in which third-party data centers in remote, landlocked locations conduct mission-critical technology operations while the client institution copes with the disaster.
Online treasury management solutions are not as secure as once thought. With various authentication methods having been compromised in the past few years, these solutions are popular targets for cybercriminals looking to pocket fraudulent payments.
In the financial industry, software is largely considered a trade secret. Speed is everything in the trading environment -- how an application performs can make or break competitive advantage. But there is a delicate balance between giving back to the open source community and maintaining competitive advantage and trade secrets. I encourage the financial industry to continue to find that balance, investing in and supporting open source projects that can help the industry overall.
Big Data discussions should really be about Big Value Data -- data that’s truly mission-critical to the enterprise -- rather than merely Big Data, the vast majority of which is unactionable intelligence that isn’t critical to most missions.
The recent outages in banking systems point to an underlying and growing issue within banks. The impact of some of these outages has been so severe that it has led some to describe retail banks as IT companies with banking licenses. While this is an extreme view, the failures do highlight that the technology infrastructure at the core of banking operations is in a poor state -- another item to add to the list of regulator concerns.