Technology

HARNESSING CONNECTIVITY: THE KEY TO GLOBAL EXCHANGE OPERATORS FINDING NEW HORIZONS IN THE CLOUD

Source: Finance Derivative

Offering the massive amounts of storage and compute power that drive our digital lives, the cloud is an incredible innovation, one that has changed the way we communicate, the way we consume entertainment, the way we work – and, yes, the way we do business. But the cloud cannot work without the network infrastructure underneath.

Global financial exchange operators are harnessing cloud technology to make marketplaces more accessible, secure, and cost-efficient. And reliable network infrastructure is critical to enabling the robust and secure connectivity required by buyers and sellers, fund managers, and investment banks in a marketplace.

Scaling up with the cloud

It was the potential of the cloud that caught the attention of Intercontinental Exchange (ICE), a Fortune 500 firm that operates regulated marketplaces, including the New York Stock Exchange, and provides market data and technology solutions for the listing, trading, and clearing of derivatives contracts and financial securities across major asset classes.

ICE has long offered a significant portfolio of data services used to support the trading, investment, and risk management needs of financial institutions, corporations, and government entities around the world. But to scale up, the ICE Global Network team knew they had to extend these services beyond the company’s own network, allowing customers to utilise financial and data services wherever and whenever they were needed.

The answer, they realised, lay in the cloud.

It was a conclusion reached after observing customer behaviour. ICE Global Network noticed that not all of its clients required direct access to its global network via a physical point of presence in order to utilise important business content and services. Aware that many had also woven public cloud services into their in-house computing architecture, ICE knew that it was there, in the cloud, that it wanted to make strides forward.

The result of this understanding was the development of what ICE calls ‘IGN Cloud Connect’: a service that provides customers secure access to ICE’s proprietary data content – including ICE Consolidated Feed, ETF Hub, Pricing, and Reference Data – directly from the cloud. An intuitive solution, IGN Cloud Connect is designed for easy customer adoption and integration on a rapid timeframe.

A key part of ensuring the success of this project was designing cloud-based access that was on par with on-premises solutions, including security and low latency. ICE Global Network also needed to make sure that customers could adopt the new solution quickly and integrate it easily with the systems they already use. This was where a software-defined network (SDN) made all the difference.

Connecting it all together

With all the flexibility that the cloud provides, matching it with the right connectivity is key to fully utilising its potential – potential that cannot be reached over the unreliable public internet. With a software-defined approach, everything is mediated through a software layer that connects the full cloud infrastructure.

Using software-defined networking to build a purpose-built, private network-as-a-service (NaaS) greatly simplifies the process of creating and provisioning hybrid and multicloud networks – making it the ideal solution to match the flexibility that ICE Global Network required, while ensuring that it was compatible with customer needs too.

Utilising a NaaS solution to connect to the cloud also helped to reduce the complexity and cost of IT infrastructure. More importantly for financial services, it can reduce downtime, because it virtualises most of the physical networking devices and thereby lightens the network management and maintenance load.

Another major benefit for exchange operators is that they can quickly adapt to peaks and troughs in demand. Increasing bandwidth when needed can be critical to quickly adapting to changes in the stock market, for example. Ensuring that IGN Cloud Connect has elastic connectivity to the cloud, where bandwidth can be turned up or down in an instant, provides additional flexibility for customer needs. It also ensures that core business operations keep performing during peaks in demand and, equally importantly, that IT leaders are not paying for bandwidth they don’t need.
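To make the idea of elastic connectivity more concrete, the sketch below shows how a bandwidth change might be requested programmatically on a NaaS connection. The endpoint, connection identifier, and field names are hypothetical illustrations, not ICE’s or any particular provider’s API; real providers expose their own provisioning interfaces.

```python
# Hypothetical sketch: adjusting committed bandwidth on a NaaS cloud connection.
# The API endpoint, token, and field names are invented for illustration only.
import requests

NAAS_API = "https://naas.example.com/v1"          # hypothetical provider endpoint
HEADERS = {"Authorization": "Bearer REPLACE_WITH_API_TOKEN"}

def set_bandwidth(connection_id: str, mbps: int) -> dict:
    """Request a new committed bandwidth on an existing cloud connection."""
    resp = requests.patch(
        f"{NAAS_API}/connections/{connection_id}",
        json={"bandwidth_mbps": mbps},
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Scale up ahead of an expected spike in market data traffic, then back down
# so the business is not paying for idle capacity.
set_bandwidth("conn-exchange-to-cloud", mbps=10_000)
set_bandwidth("conn-exchange-to-cloud", mbps=1_000)
```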

Bringing the cloud off ICE

By utilising a NaaS solution, ICE Global Network’s new cloud offering doesn’t sacrifice any of the performance, security, or privacy that its customers have come to enjoy when connecting to data services locally. Better yet, the connectivity ensures that IGN Cloud Connect is ‘provider neutral’, meaning users can access ICE Global Network’s data via multiple public cloud environments, including Amazon Web Services (AWS), Microsoft Azure, and Google Cloud. Coupled with a high level of reliability and cost efficiency, it’s a compelling proposition.

With the launch of IGN Cloud Connect, ICE Global Network has successfully expanded its services to prominent cloud platforms in no fewer than twenty-four countries worldwide. With that sort of reach, the company’s customers have access to over 229 cloud on-ramps across North America, Europe, and Asia Pacific – each one performance-tuned and secured against cyber intrusion.

Industry-wide benefits of the cloud

ICE needn’t be the only one looking to the skies with hopes of expansion: cloud technology can be leveraged right across the exchange operator industry if it is coupled with the right connectivity.

Using software-defined networking that bypasses the public internet, exchange operators can offer their customers access to data and financial services with far greater flexibility than conventional means, without losing a shred of the security, privacy, or performance afforded by a local connection.

Add in the cost-saving potential of flexible bandwidth fees and a shorter time to market thanks to smarter connectivity, and the move into the cloud becomes an attractive proposition for the exchange operator sector.

Business

Driving business success in today’s data-driven world through data governance

Source: Finance Derivative

Andrew Abraham, Global Managing Director, Data Quality, Experian

It’s a well-known fact that we are living through a period of digital transformation, where new technology is revolutionising how we live, learn, and work. However, what this has also led to is a significant increase in data. This data holds immense value, yet many businesses across all sectors struggle to manage it effectively. They often face challenges such as fragmented data silos or lack the expertise and resources to leverage their datasets to the fullest.

As a result, data governance has become an essential topic for executives and industry leaders. In a data-driven world, its importance cannot be overstated. Combine that with governments and regulatory bodies rightly stepping up oversight of the digital world to protect citizens’ private and personal data, and businesses now also have to comply with several statutes more accurately and more frequently.

We recently conducted some research to gauge businesses’ attitudes toward data governance in today’s economy. The findings are not surprising: 83% of those surveyed acknowledged that data governance should no longer be an afterthought and could give them a strategic advantage. This is especially true for gaining a competitive edge, improving service delivery, and ensuring robust compliance and security measures.

However, the research also showed that businesses face inherent obstacles, including difficulties in integration and scalability and poor data quality, when it comes to managing data effectively and responsibly throughout its lifecycle.

So, what are the three fundamental steps to ensure effective data governance?

Regularly reviewing Data Governance approaches and policies

Understanding your whole data estate, having clarity about who owns the data, and implementing rules to govern its use means being able to assess whether you can operate efficiently and identify where to drive operational improvements. To do that effectively, you need the right data governance framework. Implementing a robust framework allows businesses to ensure their data is fit for purpose, improve its accuracy, and mitigate the detrimental impact of data silos.
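As a simple illustration of what ‘implementing rules to govern its use’ can look like in practice, the sketch below encodes a few automated data quality checks in Python, assuming pandas and an entirely invented customer table. It is a minimal example of codified, repeatable rules, not a substitute for a full governance platform.

```python
# Minimal, illustrative sketch of automated data quality rules, using an
# invented customer dataset; real governance tooling is far richer, but the
# principle (codified, repeatable checks with clear ownership) is the same.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", None, "b@example.com", "not-an-email"],
    "owner": ["sales", "sales", "sales", None],   # accountable data owner
})

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return simple counts a governance dashboard could track over time."""
    non_null_emails = df["email"].dropna()
    return {
        "duplicate_ids": int(df["customer_id"].duplicated().sum()),
        "missing_email": int(df["email"].isna().sum()),
        "invalid_email": int((~non_null_emails.str.contains("@")).sum()),
        "missing_owner": int(df["owner"].isna().sum()),
    }

print(run_quality_checks(customers))
# {'duplicate_ids': 1, 'missing_email': 1, 'invalid_email': 1, 'missing_owner': 1}
```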

The research also found that data governance approaches are typically reviewed annually (46%), with another 47% of businesses reviewing them more frequently. Whilst the right timeframe differs for each business, policies should generally be reviewed more often than once a year. Interestingly, 6% of companies surveyed in our research keep theirs under continual review.

Assembling the right team

A strong team is crucial for effective cross-departmental data governance.  

The research identified that almost three-quarters of organisations, particularly in the healthcare industry, are managing data governance in-house. Nearly half of the businesses surveyed had already established dedicated data governance teams to oversee daily operations and mitigate potential security risks.

This strategic investment reflects a proactive approach to enhancing data practices in order to achieve a competitive edge and improve financial performance, and it underlines the pivotal role of dedicated teams in upholding data integrity and compliance standards.

Choose data governance investments wisely

With AI changing how businesses are run and being seen as a critical differentiator, nearly three-quarters of respondents to our research said data governance is the cornerstone of better AI. Why? Effective data governance is essential for optimising AI capabilities: it improves data quality and supports automated access control, metadata management, data security, and integration.

In addition, almost every business surveyed said it will invest in its data governance approaches in the next two years. This includes investing in high-quality technologies and tools and improving data literacy and skills internally.  

Regarding automation, the research showed that under half currently use automated tools or technologies for data governance; 48% are exploring options, and 15% said they have no plans.

This shows us a clear appetite for data governance investment, particularly in automated tools and new technologies. These investments also reflect a proactive stance in adapting to technological changes and ensuring robust data management practices that support innovation and sustainable growth.

Looking ahead

Ultimately, the research showed that 86% of businesses recognised the growing importance of data governance over the next five years. This indicates that effective data governance will only grow in importance as organisations navigate digital transformation and regulatory demands.

This means businesses must address challenges like integrating governance into operations, improving data quality, ensuring scalability, and keeping pace with evolving technology to mitigate risks such as compliance failures, security breaches, and data integrity issues.

Embracing automation will also streamline data governance processes, allowing organisations to enhance compliance, strengthen security measures, and boost operational efficiency. By investing strategically in these areas, businesses can gain a competitive advantage, thrive in a data-driven landscape, and effectively manage emerging risks.

Technology

‘Aligning AI expectations with AI reality’

By Nishant Kumar Behl, Director of Emerging Technologies at OneAdvanced

AI is transforming the way we work now and will continue to make great strides into the future. In many of its forms, it demonstrates exceptional accuracy and a high rate of correct responses. Some people worry that AI is too powerful, with the potential to cause havoc on our socio-political and economic systems. There is a converse narrative, too, that highlights some of the surprising and often comical mistakes that AI can produce, perhaps with the intention of undermining people’s faith in this emerging technology.

This tendency to scrutinise the occasional AI mishap, despite its frequent correct responses, overshadows the technology’s overall reliability and creates an unfairly high expectation of perfection. With such a singular focus on failure, it is no surprise that almost 80% of AI projects fail within a year. Considering all of the hype around AI, and particularly GenAI, over the past few years, it is understandable that users feel short-changed when their extravagant expectations are not met.

We shouldn’t forget that a lot of the most useful software we all rely on in our daily working lives contains bugs. They are an inevitable and completely normal byproduct of developing and writing code. Take a look at the internet, awash with comments, forums, and advice pages to help users deal with bugs in commonly used Apple and Microsoft word processing and spreadsheet apps.

If we can accept blips in our workhorse applications, why are we holding AI to such a high standard? Fear plays a part here. Some may fear AI can do our jobs to a much higher standard than we can, sidelining us. No technology is smarter than humans. As technology gets smarter, it pushes humans to become smarter. When we collaborate with AI, the inputs of humans and artificial intelligence work together, and that’s when magic happens.

AI frees up more human time and lets us be creative, focusing on more fulfilling tasks while the technology does the heavy lifting. But AI is built by humans and will continue to need people asking the right questions and making connections based on our unique human sensibility and perception if it is to become more accurate, useful, and better serve our purpose.

The fear of failing to master AI implementation might be quite overwhelming for organisations. In some cases, people are right to be cautious. There is a tendency now to expect all technology solutions to have integrated AI functionality for its own sake, which is misguided. Before deciding on any technology, users must first identify and understand the problem they are trying to solve and establish whether AI is indeed the best solution. Don’t be blinded by science, adopting bells and whistles that aren’t going to deliver the best results.

Uncertainty and doubt will continue to revolve around the subject of AI, but people should be reassured that there are many reliable, ethical technology providers developing safe, responsible, compliant AI-powered products. These organisations recognise their responsibility to develop products that offer long-term value rather than generating temporary buzz. By directly engaging with customers to understand their needs and problems, a customer-focused approach helps identify whether AI can effectively address the issues at hand before proceeding down the AI route.

In any organisation, the leader’s job is to develop strategy, ask the right questions, provide direction, and often devise action plans. When it comes to AI, we will all need to adopt that leadership mindset in the future, ensuring we are developing the right strategy, asking insightful questions, and devising an effective action plan that enables the engineers to execute appropriate AI solutions for our needs.

Organisations should not be afraid to experiment with AI solutions and tools, remembering that every successful innovation involves some failure and frustration. The light bulb moments rarely happen overnight, and we must all adjust our expectations rather than demanding that AI offer a perfect solution from day one. There will be bugs and problems, but the journey towards improvement will result in long-term, sustainable value from AI, where everyone can benefit.

====

Nishant Kumar Behl is Director of Emerging Technologies at OneAdvanced, a leading provider of sector-focussed SaaS software, headquartered in the UK.

Business

Machine Learning Interpretability for Enhanced Cyber-Threat Attribution

Source: Finance Derivative

By: Dr. Farshad Badie, Dean of the Faculty of Computer Science and Informatics, Berlin School of Business and Innovation

This editorial explores the crucial role of machine learning (ML) in cyber-threat attribution (CTA) and emphasises the importance of interpretable models for effective attribution.

The Challenge of Cyber-Threat Attribution

Identifying the source of cyberattacks is a complex task due to the tactics employed by threat actors, including:

  • Routing attacks through proxies: Attackers hide their identities by using intermediary servers.
  • Planting false flags: Misleading information is used to divert investigators towards the wrong culprit.
  • Adapting tactics: Threat actors constantly modify their methods to evade detection.

These challenges necessitate accurate and actionable attribution for:

  • Enhanced cybersecurity defences: Understanding attacker strategies enables proactive defence mechanisms.
  • Effective incident response: Swift attribution facilitates containment, damage minimisation, and speedy recovery.
  • Establishing accountability: Identifying attackers deters malicious activities and upholds international norms.

Machine Learning to the Rescue

Traditional machine learning models have laid the foundation, but the evolving cyber threat landscape demands more sophisticated approaches. Deep learning and artificial neural networks hold promise for uncovering hidden patterns and anomalies. However, a key consideration is interpretability.

The Power of Interpretability

Effective attribution requires models that not only deliver precise results but also make them understandable to cybersecurity experts. Interpretability ensures:

  • Transparency: Attribution decisions are not shrouded in complexity but are clear and actionable.
  • Actionable intelligence: Experts can not only detect threats but also understand the “why” behind them.
  • Improved defences: Insights gained from interpretable models inform future defence strategies.

Finding the Right Balance

The ideal model balances accuracy and interpretability. A highly accurate but opaque model hinders understanding, while a readily interpretable but less accurate model provides limited value. Selecting the appropriate model depends on the specific needs of each attribution case.

Interpretability Techniques

Several techniques enhance the interpretability of ML models for cyber-threat attribution:

  • Feature Importance Analysis: Identifies which aspects of the input data most influence the model’s decisions, allowing experts to prioritise investigations (a brief sketch follows this list).
  • Local Interpretability: Explains the model’s predictions for individual instances, revealing why a specific attribution was made.
  • Rule-based Models: Provide clear guidelines for determining the source of cyber threats, promoting transparency and easy understanding.
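To ground the first of these techniques, the following is a minimal sketch of permutation-based feature importance, assuming a scikit-learn classifier trained on entirely synthetic, invented ‘incident feature’ data. The feature names and dataset are illustrative assumptions, not drawn from any real attribution system, but the pattern (measure how much shuffling each feature degrades accuracy) is the standard one.

```python
# Illustrative sketch: a toy attribution classifier on synthetic data,
# inspected with permutation importance. All feature names are invented.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

feature_names = [
    "proxy_hop_count", "payload_entropy", "c2_domain_age_days",
    "login_hour_utc", "tooling_fingerprint_score", "packet_interval_ms",
]

# Synthetic stand-in for labelled incident data (features -> threat-actor label).
X, y = make_classification(n_samples=1000, n_features=len(feature_names),
                           n_informative=4, n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# How much does randomly shuffling each feature degrade test accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for idx in result.importances_mean.argsort()[::-1]:
    print(f"{feature_names[idx]:>26}: {result.importances_mean[idx]:.3f}")
```

The same trained model could then be examined case by case with local explanation tools to support the second technique, though that lies beyond this short sketch.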

Challenges and the Path Forward

The lack of transparency in complex ML models hinders their practical application. Explainable AI, a field dedicated to making models more transparent, holds the key to fostering trust and collaboration between human analysts and machine learning systems. Researchers are continuously refining interpretability techniques, with the ultimate goal being a balance between model power and decision-making transparency.
