Data: Complete visibility and control of the pipeline

A new technology offers companies a single pipeline to manage and control their petabyte-scale data collection, enabling reliable and cost-effective analytics and automation.

Modern clouds generate an enormous amount and variety of data. Business stakeholders want more data-driven insights and automation to make better decisions, increase productivity and reduce costs. However, creating a unified and user-friendly environment for data analytics and automation is difficult due to the complexity and diversity of modern cloud architectures and the variety of monitoring and analysis tools in companies.

Data pipelines, analytics and automation require high security standards

Additionally, companies must ensure that their data pipelines, analytics and automation comply with security and privacy standards such as GDPR. Therefore, companies need visibility and control over their data pipelines, while controlling costs and maximizing the value of their existing data analytics and automation solutions.

Dynatrace OpenPipeline gives business, development, security and operations teams complete visibility and control over their data ingestion, while maintaining the context of the data and the cloud environments from which it comes.

The solution enables these teams to collect, converge, route, enrich, dedupe, filter, mask and transform observability, security and business event data from any source - including Dynatrace® OneAgent, Dynatrace APIs and OpenTelemetry - with customizable retention times for individual use cases.

This allows organizations to manage the ever-increasing volume and diversity of data from their hybrid and multi-cloud ecosystems and empower more teams to access the Dynatrace platform's AI-powered answers and automation without needing additional tools.
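The processing steps named above (dedupe, filter, mask, transform) can be illustrated with a minimal sketch. This is not OpenPipeline's actual API, only a generic Python illustration of the same pipeline stages; the event fields (`level`, `email`) and the hash-based masking are assumptions for the example.

```python
import hashlib
import json

def mask_pii(event, fields=("email", "user")):
    """Replace personal-data fields with a one-way hash (masking stage)."""
    for f in fields:
        if f in event:
            event[f] = hashlib.sha256(event[f].encode()).hexdigest()[:12]
    return event

def process(events):
    """Dedupe, filter, and mask raw events before they are stored."""
    seen = set()
    out = []
    for ev in events:
        key = json.dumps(ev, sort_keys=True)
        if key in seen:                   # dedupe: skip identical events
            continue
        seen.add(key)
        if ev.get("level") == "DEBUG":    # filter: drop low-value records
            continue
        out.append(mask_pii(dict(ev)))    # mask: hide personal data
    return out
```

Running `process` over a batch containing a duplicate, a debug record, and an email address would yield a single event with the email replaced by a hash.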

Benefits of working with other core Dynatrace platform technologies

Dynatrace OpenPipeline works with other core technologies in the Dynatrace platform, including Grail Data Lakehouse, Smartscape topology, and Davis hypermodal AI. This offers the following advantages:

  • Petabyte-scale data analysis: Leverages patent-pending stream processing algorithms to achieve dramatically increased data throughput at petabyte scale.
  • Uniform data collection: This enables teams to collect observability, security, and business event data from any source and in any format, including Dynatrace OneAgent, Dynatrace APIs, open source frameworks such as OpenTelemetry, and other telemetry signals.
  • Real-time data analysis during ingestion: This enables teams to convert unstructured data, such as logs, into structured and usable formats - such as converting raw data into time series, calculating metrics, or creating business events from log lines - right at the point of ingestion.
  • Full data context: The context of heterogeneous data points - including metrics, traces, logs, behavior, business events, vulnerabilities, threats, lifecycle events and many others - is maintained and reflects the different parts of the cloud ecosystem from which they originate.
  • Privacy and security controls: Users have control over which data they analyze, store, or exclude from analysis. The solution includes fully customizable security and privacy controls to meet customers' specific needs and regulatory requirements, such as automatic, role-based masking of personal data.
  • Cost-effective data management: This helps teams avoid collecting duplicate data and reduce storage requirements by transforming data into usable formats (e.g. from XML to JSON) and allowing teams to remove unnecessary fields without losing insights, context, or analysis flexibility.
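Two of the transformations listed above, turning an unstructured log line into a structured business event and converting an XML payload to a JSON-ready structure, can be sketched briefly. The log format, field names, and flat XML layout are hypothetical; this only illustrates the kind of at-ingest transformation described, not Dynatrace's implementation.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical log format: "<timestamp> <level> order=<id> amount=<value>"
LOG_PATTERN = re.compile(
    r'(?P<ts>\S+) (?P<level>\w+) order=(?P<order>\d+) amount=(?P<amount>[\d.]+)'
)

def log_to_event(line):
    """Turn an unstructured log line into a structured business event."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    ev = m.groupdict()
    ev["amount"] = float(ev["amount"])  # typed field, usable for metrics
    return ev

def xml_to_dict(xml_text):
    """Convert a flat XML payload into a JSON-ready dict (leaf tags only)."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}
```

For example, `log_to_event("2024-01-31T12:00:00Z INFO order=42 amount=19.99")` yields a dictionary with a numeric `amount` field that downstream analytics can aggregate into time series.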

Five to ten times faster data processing

“OpenPipeline is a powerful addition to the Dynatrace platform,” said Bernd Greifeneder, CTO at Dynatrace. “It enriches, converges and contextualizes the heterogeneous observability, security and business data coming from the clouds and provides unified analytics across that data and the services it represents. Like the Grail Data Lakehouse, we built OpenPipeline for petabyte-scale analytics. OpenPipeline works with Dynatrace's Davis hypermodal AI to extract meaningful insights from data, enabling robust analysis and reliable automation.

“According to our internal testing, OpenPipeline powered by Davis AI enables our customers to process data five to ten times faster than comparable technologies. Bringing together and contextualizing data within Dynatrace makes it easier to comply with regulatory requirements and conduct audits, while giving more teams within organizations immediate visibility into the performance and security of their digital services.”

More at Dynatrace.com
About Dynatrace

Dynatrace ensures that software works perfectly worldwide. Our unified software intelligence platform combines broad and deep observability and continuous runtime application security with the most advanced AIOps to deliver answers and intelligent automation from data at remarkable scale. This enables organizations to modernize and automate cloud operations, deliver software faster and more securely, and ensure flawless digital experiences.

