    Dr Mehmet Yildiz

    Vital Technical Points for Data Solutions In Digital Ventures

    2021-04-19

    Data sources in digital ventures keep changing. They come from multiple modern and legacy sources. Every day new data sources become available, and some data sources get decommissioned.


    Photo by Charles Deluvio on Unsplash

    Technical leaders of digital ventures leverage architectural and design thinking skills, established and emerging technology stacks, project management methods, and many digital enablement tools to create customer-focused solutions for digital transformation goals. These bespoke solutions can manifest as digital products or services depending on the purposes and the scope of the transformation initiative.

    Within these bespoke solutions, one critical item requires special attention from all stakeholders: the data solution. Big Data solutions in particular pose distinct requirements and call for expertise beyond the technical teams.

    Apart from the essential architectural considerations made by venture technical leaders, these solutions also require domain expertise in data and information management. At the highest level, technical leaders responsible for data platforms need to identify optimal approaches to collecting, storing, processing, analysing, and presenting Big Data. However, practical solutions covering these broad processes must be architected by specialised Big Data architects, experienced information architects, and seasoned information management professionals.

    Big Data solutions require heterogeneous technology stacks and tools to fit the purpose of the digital transformation solution. It is essential to realise that no single technology or tool can provide everything needed to develop Big Data solutions. There may be marketing pitches for one-tool-fits-all solutions, but in my experience they all fall short of delivering end-to-end solutions when examined closely. I have not come across such a tool in many decades of practice.

    Due to their dependencies on and relationships with many components, attributes, and factors, Big Data solutions cannot be developed in isolation or in silos.

    Digital technology leaders need to consider the entire ecosystem, break down the silos that encourage isolated thinking, and focus on integration and federation. Where data solutions are concerned, integrated architectural factors can affect the venture as a whole, not just a single initiative.

    Data platforms and Big Data solutions require flexible capacity and highly scalable systems, processes, technologies and tools from an infrastructure perspective. Scalability and capacity management are fundamental requirements for Big Data solutions.

    Compromising scalability and capacity requirements, even slightly, can cause undesirable situations, troubled projects, and failed service levels once the solution is in production. These requirements must be taken into consideration in the early phases of developing the solution.

    The modularity of the solution plays an essential role in scalability and capacity management. When designing modularity requirements, the key consideration is that the modules of the solution (its building blocks) must fit into the big picture of the solution. For example, the same data could be used by different initiatives, projects, and services rather than creating unnecessary data access silos in the venture. Failing to achieve modularity can have financial implications, cause customer satisfaction issues, and even lead to regulatory non-compliance.
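
    As a loose illustration of this shared building-block idea, the following Python sketch shows a single, hypothetical data-access module being reused by two different initiatives instead of each initiative copying the data into its own silo. The class name, record fields, and initiative names are invented for the example.

    class CustomerRepository:
        """A single, reusable building block for customer data access."""

        def __init__(self, records):
            self._records = records  # stand-in for the venture's real data store

        def by_country(self, country):
            return [r for r in self._records if r["country"] == country]

    # Two different initiatives reuse the same module instead of building
    # their own copies of the data and their own access code.
    repo = CustomerRepository([
        {"id": 1, "name": "Acme", "country": "AU"},
        {"id": 2, "name": "Globex", "country": "US"},
    ])

    marketing_view = repo.by_country("AU")   # a marketing initiative's view
    analytics_view = repo.by_country("US")   # an analytics initiative's view
    print(marketing_view, analytics_view)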

    Big Data solutions in digital ventures require thinking out of the box. The entire team needs to approach solutions as innovators. The technical team must understand the intricacies of the latest data management and compliance technologies.

    For example, there is a trend in the industry towards trying new data analysis methods without binding them to traditional enterprise data warehouse (EDW) resources and ETL (Extract, Transform, Load) processes. ETL refers to copying data from one or more sources into a destination system that represents the data differently from the source and in a different context.
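
    To make the ETL flow concrete, here is a minimal Python sketch, assuming a hypothetical orders.csv source file and a local SQLite database as the destination; the file name, column names, and transformation rules are invented for illustration.

    import csv
    import sqlite3

    def extract(path):
        # Extract: read raw rows from the source CSV file.
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Transform: reshape each row into the destination's representation,
        # e.g. normalising amounts to cents and upper-casing country codes.
        return [
            (row["order_id"], int(float(row["amount"]) * 100), row["country"].upper())
            for row in rows
        ]

    def load(records, db_path="warehouse.db"):
        # Load: write the transformed records into the destination system.
        con = sqlite3.connect(db_path)
        con.execute(
            "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount_cents INTEGER, country TEXT)"
        )
        con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
        con.commit()
        con.close()

    load(transform(extract("orders.csv")))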

    For data management tools and analytics technologies, the venture technical team must consider mixing open-source and commercial systems based on their applicability and how well they meet the requirements.

    For example, OLTP (Online Transaction Processing) can be designed using commercially available relational databases for structured data and the open-source Apache Cassandra database for semi-structured data. OLTP is a data processing technique that runs transactional tasks such as inserting, updating, and deleting data in database records. The OLTP approach is commonly used in finance, retail, and customer relationship data solutions.
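
    As a minimal sketch of the transactional behaviour OLTP implies, the Python example below uses the built-in sqlite3 module as a stand-in for a commercial relational database; the accounts table and the transfer scenario are hypothetical.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
    con.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 500), (2, 200)])
    con.commit()

    try:
        with con:  # commits on success, rolls back automatically on error
            # A transactional task: move funds between two accounts atomically.
            con.execute("UPDATE accounts SET balance = balance - 100 WHERE id = 1")
            con.execute("UPDATE accounts SET balance = balance + 100 WHERE id = 2")
    except sqlite3.Error:
        # If either statement fails, neither change is persisted.
        print("transaction rolled back")

    print(con.execute("SELECT id, balance FROM accounts").fetchall())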


    Another key consideration is determining the timelines of data ingestion in the venture from solution readiness and quality management perspectives. Data ingestion is the process of importing, transferring, loading, processing, and storing data for use. Data ingestion is a critical aspect of Big Data analytics in the modernisation and transformation context in digital ventures.

    Data ingestion can be on a synchronous, asynchronous, or real-time basis. The data architecture team needs to articulate the selection of these options with compelling business reasons. They need to obtain validating input and approvals from data subject matter experts, business stakeholders, and the solution governance body.
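
    The contrast between batch-style and real-time ingestion can be sketched roughly in Python as below, assuming a hypothetical in-memory store; the record shapes and the queue-based event source are invented for illustration.

    import queue

    store = []

    def ingest_batch(records):
        # Batch ingestion: load a whole set of records on a schedule.
        store.extend(records)

    def ingest_realtime(event_queue, timeout=1.0):
        # Real-time ingestion: consume events one by one as they arrive.
        while True:
            try:
                store.append(event_queue.get(timeout=timeout))
            except queue.Empty:
                break  # no new events within the timeout window

    ingest_batch([{"id": 1}, {"id": 2}])  # e.g. a nightly load

    events = queue.Queue()
    events.put({"id": 3})                 # e.g. an event arriving from a stream
    ingest_realtime(events)
    print(store)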

    As it is vital to choose the type of processing to perform, whether real-time or batch, data processing may involve descriptive, predictive, prescriptive, diagnostic, and ad-hoc analytics. To meet these analytics requirements, the technical team needs to factor in the business sponsors' latency expectations for processing. These factors can play an essential role in the success of data solutions.

    After ingestion, the next point to consider is data access. Data access can be in random or sequential order. Data access patterns require careful and detailed visualisation in the solution planning phase, as they are necessary to optimise data access requirements.

    There are many patterns available in the data application integration and interfacing body of knowledge. For example, some common patterns are accelerating database resource initialisation, eliminating data access bottlenecks, and hiding obscure database semantics from data users.
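
    As one rough example of eliminating a data access bottleneck, the Python sketch below caches repeated reads of the same record; the slow_lookup function is a hypothetical stand-in for an expensive database or service call.

    import functools
    import time

    @functools.lru_cache(maxsize=1024)
    def slow_lookup(customer_id):
        # Stand-in for an expensive database or service call.
        time.sleep(0.1)
        return ("customer", customer_id, "retail")

    start = time.perf_counter()
    slow_lookup(42)  # first call hits the slow backing store
    slow_lookup(42)  # repeated call is served from the in-memory cache
    print(f"two lookups took {time.perf_counter() - start:.2f}s")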

    The database optimisation process requires careful consideration at various stages. Using optimisation techniques can improve the quality and speed of data access, read and write activities. Some critical optimisation considerations are using appropriate indexes, removing unnecessary indexes, and minimising data transfers from client to server.
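
    To illustrate the indexing point, the sketch below uses SQLite's query planner to show how an appropriate index changes a full table scan into an index search; the table, column, and index names are invented for the example.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, country TEXT, amount INTEGER)")
    con.executemany(
        "INSERT INTO orders (country, amount) VALUES (?, ?)",
        [("AU", i) for i in range(1000)] + [("US", i) for i in range(1000)],
    )

    query = "SELECT * FROM orders WHERE country = 'AU'"

    # Without an index, the planner scans the whole table.
    print(con.execute("EXPLAIN QUERY PLAN " + query).fetchall())

    # With an appropriate index, it searches only the matching rows.
    con.execute("CREATE INDEX idx_orders_country ON orders (country)")
    print(con.execute("EXPLAIN QUERY PLAN " + query).fetchall())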

    These are very high-level data lifecycle management considerations in digital ventures. They can be considered only the tip of the iceberg in developing Big Data solutions for digital transformation initiatives, and the devil is in the detail for each of the items covered in this article.

    Digital technology leaders don’t have to go into the details of each building block. However, they need to be aware of these items and ensure the architecture, design, and technical specialist teams consider them and constantly create solutions with transparent input from business stakeholders.

    The crucial role of the digital technology leader is to break silos and facilitate integrated data solutions, as data is the most complex part of solutions in digital ventures. If data is compromised or not used properly, many aspects of the venture are affected, both financially and from a customer satisfaction perspective.

    Requirements for data solutions are dynamic. They keep changing based on industry, initiative goals, customer expectations, and many other factors that can be beyond the control of the architecture and design team.

    Therefore, technology leaders must encourage the core and extended team to use established methods, re-usable intellectual assets, proven processes, purposeful technologies, and well-supported tools to produce successful data solutions for the digital venture.

    Thank you for reading my perspectives.

    Related articles on News Break

    What Does Digitally Intelligent Mean?

    10 Critical Tips To Unfold Digital Intelligence

    Financial Considerations For Digital Ventures

    A Methodical And Innovative Approach to Digital Venture Cost Management

    Effective Use of Innovative And Inventive Thinking For Digital Ventures

    Smart Simplification For Business And Market Competition

    Accelerated and Pragmatic Approaches In Digital Ventures

    Collaborative Intelligence And Fusion Culture In Digital Ventures

    Creating Trust And Credibility In Diverse Digital Ventures

    Why The Cloud Services Matter To Digital Ventures

    Digital Ventures Can Save Money And Get Work Done Fast With Open-Source

    Leveraging Ethical Hacking for Cybersecurity Requirements of Digital Ventures
