
Data-Driven

@COGNITIVO DATA-DRIVEN MEANS

Using data to inform decision making and automate actions where the impact can be quantitatively measured.

Being data-driven means your organisation has closed the loop on data collection, action-taking and measurement.

You can continuously and nimbly optimise the actions you take across all areas of your organisation based on the data you collect. That data also allows you to automate human-like actions and interactions.

Data is your moat: it is not a feature that can be easily copied.

modern Cloud Data Platforms

DATA Capability Building

What does a modern data capability mean for organisations in today's increasingly competitive and cost-conscious world? We've put a lot of thought and practice with our customers into that question! Every organisation is different, but there are some common themes that we can help make a reality for you:

  • Modern characteristics - data platforms should be scalable, resilient and highly available. They should support real-time complex event processing and be multi-modal (options for NoSQL, relational, graph and other databases). They should be simple to operate and keep costs under control.

  • A place for experimentation - We understand that the needs of data scientists, engineers and app builders can place undue burden and performance challenges on traditional architectures. We can help you plan isolated, cost-controlled and specialised pathways that insulate production environments from these burdens and accelerate the research efforts of your smartest people!

  • The rails to automate - The building blocks of your data platform should conform to industry standards and practices so that robotic process automation (RPA), integration platforms and other digital workflow tools work seamlessly.

Integration Strategy and Architecture

No data capability is complete without a great integration platform to drive its consumption. Cognitivo works with you to plan the pipes and fittings needed to bring life to your new cloud data platform. Our experience working with the major public cloud providers and integration product vendors means that we can give you the best advice on how to deliver data to your customers and staff, no matter the technology stack.


GEOSPATIAL

Roadside Asset Identification

Cognitivo’s ROADSIDE ASSET MANAGEMENT platform helps local councils identify, review and manage roadside assets through a combination of rules-based logic and image recognition.


Wi-Fi Localisation Analytics

Cognitivo works in collaboration with CSIRO’s Data61 to deploy the Wi-Fi localisation technology (location detection and telemetry) developed within the Cybernetics research group. This technology allows for anonymised sub-metre location of Wi-Fi receivers.

This technology has use cases in collecting rich population and telemetry data that can be used for population measurement or asset tracking both indoors and within challenging environments.

Refer to Data61’s Real-time Passive Tracking and Situational Awareness page for more information on this research area.


Practical AI Solutions

Breathing Life into Receipts

The usefulness of receipts for merchants, retailers and financial institutions should not end as soon as they have been issued to customers. Through our research collaboration with the University of NSW and the efforts of our customer engagement team, we have built machine learning models that mine, model and map information on ordinary receipts to industry standard categories like GS1.
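To show the shape of that mapping step, here is a minimal sketch. The production system described above uses trained machine learning models; this illustration substitutes a simple keyword lookup, and the category codes and keyword table are entirely hypothetical, not real GS1 codes.

```python
# Hypothetical keyword-to-category table; codes are illustrative only,
# standing in for GS1-style category identifiers.
KEYWORD_TO_CATEGORY = {
    "milk": "10000025",
    "bread": "10000045",
}

def categorise(line_item: str) -> str:
    """Map a raw receipt line item to a category code (sketch only)."""
    for token in line_item.lower().split():
        if token in KEYWORD_TO_CATEGORY:
            return KEYWORD_TO_CATEGORY[token]
    return "UNCLASSIFIED"

print(categorise("FULL CREAM MILK 2L"))  # → 10000025
print(categorise("MYSTERY ITEM"))        # → UNCLASSIFIED
```

In practice this lookup would be replaced by a model trained on labelled receipt data, but the input/output contract - raw line-item text in, standard category code out - stays the same.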

SKU Categorization Process

The use cases for this process are quite broad if connected to the right data feeds:

  • Merchants and retailers can understand how they are tracking compared to their peers in their geography, state and many other categories.

  • Merchants can create evidence-based promotions for their customers.

  • Retail chains can learn in real-time what is trending across categories, industry and geography.

Building an Engine for Retailers, Merchants and Financial Institutions

This research was promising, but we needed to put it to the test. Our engineers maintain a platform that enables real-time ingestion, processing and analytics of electronic receipts. We’ve also built a simple mobile app to scan paper receipts (but you can plug in your own app!).

We’ve also built out consumer-facing and merchant-facing APIs that can feed into your existing platforms and apps.

If you want to learn more about how this can work for your business, go ahead and contact us!

Overview of Cognitivo’s SKU Categorization and Analytics Engine


UNIFIED DATA RISK MANAGEMENT

Cognitivo has an in-house developed data management framework which unifies four data risk management topics under the ISO 31000 standard for risk management:

  • Data Use and Quality - How does the organisation want to use the data, who should it be shared with, and what level of assurance is needed over its quality?

  • Privacy & Confidentiality - What data needs to be explicitly restricted from certain parties for reasons of privacy and confidentiality?

  • Retention & Disposal - How long should certain data be retained, and when should data we no longer need be disposed of, whether to meet obligations or to reduce storage costs?

  • Information Security - What should our information security and access control environment look like in order to enforce the above business and policy objectives?

For example, compliance with GDPR’s right-to-be-forgotten requirement touches both privacy and retention. Your organisation cannot deal with these four areas of data risk in isolation.
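The interplay between privacy and retention can be made concrete with a small sketch: an erasure request cannot simply be honoured; it must first consult retention obligations. The record types and retention periods below are hypothetical, for illustration only.

```python
from datetime import date

# Hypothetical retention obligations, in years, per record type.
RETENTION_YEARS = {"transaction": 7, "marketing_profile": 0}

def can_erase(record_type: str, created: date, today: date) -> bool:
    """An erasure (right-to-be-forgotten) request is only actionable
    once any mandatory retention period for that record has lapsed."""
    years_held = (today - created).days / 365.25
    return years_held >= RETENTION_YEARS.get(record_type, 0)

print(can_erase("marketing_profile", date(2024, 1, 1), date(2024, 6, 1)))  # → True
print(can_erase("transaction", date(2024, 1, 1), date(2024, 6, 1)))        # → False
```

A privacy-only view would erase both records; a retention-only view would keep both. Only a unified view reaches the right answer for each.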

Our approach utilises a fine-grained, attribute-based model built to accommodate customer-centric, API-driven, cloud-based and zero/low-trust architectures.
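As a minimal sketch of what "fine-grained, attribute-based" means in a zero-trust posture, the decision below is made per request from attributes of the subject, the data and the purpose, rather than from a coarse role. The attribute names, clearance levels and permitted purposes are illustrative assumptions, not Cognitivo's actual policy model.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    subject_clearance: str       # e.g. "low" / "high"
    attribute_sensitivity: str   # sensitivity of the data attribute itself
    purpose: str                 # declared purpose of the access

PERMITTED_PURPOSES = {"service_delivery", "fraud_detection"}

def allow(req: AccessRequest) -> bool:
    # Zero-trust posture: deny unless every condition is explicitly met.
    if req.attribute_sensitivity == "restricted" and req.subject_clearance != "high":
        return False
    if req.purpose not in PERMITTED_PURPOSES:
        return False
    return True

print(allow(AccessRequest("low", "public", "service_delivery")))   # → True
print(allow(AccessRequest("low", "restricted", "fraud_detection")))  # → False
```

Note that the decision is evaluated at the attribute level on each request, which is what lets the same policy engine serve API-driven and customer-centric access patterns.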

If you are still thinking in terms of role-based authentication and domain-based data management (for example, if you’ve been told to look at key data elements), your data management and information security practices may struggle to keep pace with these architectures.

Refer to our blog post for more information on our unified data risk management framework or get in touch with our team.

data quality

Cognitivo’s Data Quality approach uses analytical techniques to determine the accuracy of data sets. This differs from most out-of-the-box data applications available on the market today, which are focused on profiling data as part of a data migration (ETL/ELT) process.

To tackle the challenge of determining what data is correct, as opposed to merely in the right format, we deploy a number of techniques.

We deploy logical tests against analytical golden records constructed from multiple internal and external data sources (including documents), and statistical tests that check for reasonable values. For financial reporting, we build independent check-sums: we aggregate source data to line-of-business, product or account level so that it can be reconciled against output report data.
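Two of these techniques - a statistical reasonableness test and an independent check-sum reconciliation - can be sketched as follows. The column names, thresholds and reported figure are illustrative assumptions, not drawn from a real engagement.

```python
import pandas as pd

# Hypothetical transaction data; columns and values are illustrative only.
txns = pd.DataFrame({
    "account": ["A1", "A1", "A2", "A2"],
    "amount":  [1200.0, -300.0, 5000.0, 250.0],
})

# 1. Statistical test: flag values outside a plausible range.
outliers = txns[(txns["amount"] < -10_000) | (txns["amount"] > 10_000)]
print(len(outliers))  # → 0

# 2. Independent check-sum: aggregate source data to account level and
#    reconcile against a figure taken independently from the output report.
account_totals = txns.groupby("account")["amount"].sum()
reported_total_a1 = 900.0  # hypothetical figure from the output report
assert abs(account_totals["A1"] - reported_total_a1) < 0.01
```

The point of the check-sum is independence: the aggregate is rebuilt from source data rather than taken from the reporting pipeline, so a discrepancy localises the error to a specific account or product line.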

Our overall approach to DQ follows these principles:

  • Build a risk-based, process-centric DQ (data quality) process

  • Take an analytical approach to DQ, building custom business rules specific to each organisation

  • Take a metadata-driven, system-agnostic approach to building out DQ rules, leveraging Cognitivo’s industry-specific business conceptual models (we have coverage of the local government and financial services sectors)

  • Build platform capabilities and skills for DQ that can also be leveraged in the future for analytics

  • Establish an ongoing remediation capability integrated back into process owners, enforced by Data Quality Assurance processes

  • Establish data governance standards and processes to minimise future DQ issues

  • Provide a modular approach which allows our clients to iteratively mature their DQ capability

  • Minimise software licence cost and vendor lock-in by utilising industry-standard open-source technologies.

Read our blog article on Tackling the Enterprise DQ Challenge or speak to us if you are interested in seeing some sample Data Quality dashboards we have developed.


Privacy ReIdentification Risk Assessment

The Australian Privacy Act requires those sharing or releasing data to mitigate the risk until there is no reasonable likelihood of re-identification occurring.

The OAIC recommends that APP-regulated entities take a risk-management approach when handling de-identified data, one which acknowledges that while the APPs may not apply to data that is de-identified in one specific context, the same data could become personal information in a different context.

Robust de-identification governance practices may include activities such as:

  • ongoing and regular re-identification risk assessments (to check that methods used are still effective and appropriate at managing the risks involved)

  • auditing data recipients to ensure that they are complying with the conditions of any data sharing agreements

Cognitivo can help implement a data risk management framework and deploy tools to periodically (and on event-driven basis) assess or quantify the re-identification risk of your shared datasets.

Cognitivo is an authorised partner reseller of CSIRO’s Data61 Re-identification Risk Ready Reckoner (R4).

R4 is a risk assessment tool designed to help evaluate the potential for re-identification of records in datasets (including 'de-identified' datasets), so as to support decision making on what data can be shared in what context. The tool examines the risk of re-identification for single attributes and combinations of multiple attributes in the dataset, and presents a dashboard to view overall risk and examine problematic attributes or records in finer detail. Its graphical user interface and risk ranking provide a one-look view of the re-identification risk of a dataset, and allows easy drill-down to the most relevant data affecting that risk.

R4 also simplifies the process of preparing a dataset for sharing or release by highlighting problematic records, and offering mitigation methods such as aggregation and perturbation to be applied to chosen attributes. Once a mitigation is applied R4 re-analyses the modified data, so that it can be used in a cycle of risk mitigation and assessment until the residual risk is considered acceptable.
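The underlying idea - that records sharing a rare combination of quasi-identifying attributes are the most re-identifiable - can be illustrated with a small sketch. This is not R4's methodology, just a simple k-anonymity-style count over hypothetical data.

```python
import pandas as pd

# Hypothetical 'de-identified' dataset; quasi-identifiers only.
df = pd.DataFrame({
    "postcode": ["2000", "2000", "2000", "3000"],
    "age_band": ["30-39", "30-39", "40-49", "30-39"],
})

# Count how many records share each attribute combination:
# small groups mean higher re-identification risk.
group_sizes = df.groupby(["postcode", "age_band"]).size()
k = group_sizes.min()                 # dataset-wide k-anonymity
risky = group_sizes[group_sizes < 2]  # combinations unique to one record

print(k)           # → 1
print(len(risky))  # → 2
```

Here two attribute combinations each match a single record, so mitigation (for example, aggregating age bands or suppressing the outlying postcode) would be applied and the counts re-run, mirroring the mitigate-and-reassess cycle described above.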

R4 helps data custodians and managers better understand the risk of re-identification so as to make informed decisions about their data, and to reduce that risk through treatment of problematic attributes or records.

Refer to CSIRO’s Data61 R4 website for full description of product features.

Let us know if you would like to find out more or would like a demo of the R4 tool.

 

methods & case studies

Let us know a little bit about the digital challenges your organisation faces and we will come back with some relevant ideas and case studies to discuss with you.
