Handling physical deletes from the source while continuing to populate your analytical data store

(2023-Oct-15)  When something important disappears, it's natural to start asking questions and looking for answers, especially when that missing piece has had a significant impact on your life. Similarly, when data that used to exist in your source system suddenly vanishes without a trace, you're likely to react the same way. You might find yourself reaching out to higher authorities to understand why the existing data management system design allowed this to happen. Your colleagues might wonder whether better ways to handle such data-related issues exist. Ultimately, you'll ask yourself what could have been done differently to avoid the complete loss of that crucial data.

Paul Felix wrote a good article, "Dealing with Deletes in the Data Warehouse", from which I will borrow a few terms to differentiate types of data deletions:

- Soft deletes – no physical data deletes occur; data records are simply tagged and deactivated.
- Hard deletes – data records are physically removed from the source system, leaving no trace behind.
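To make the distinction concrete, here is a minimal sketch (not from the article; the table and column names are purely illustrative) of how a load process might detect hard deletes upstream by comparing business keys, and then preserve history by applying a soft-delete flag in the analytical store instead of removing the rows:

```python
# Illustrative sketch: reconciling hard deletes in the source with
# soft deletes in the analytical store. Key and column names
# (customer_id, is_active) are hypothetical.

def find_hard_deleted_keys(source_keys, warehouse_keys):
    """Keys present in the warehouse but no longer in the source
    were physically (hard) deleted upstream."""
    return set(warehouse_keys) - set(source_keys)

def apply_soft_delete(warehouse_rows, deleted_keys, key_col="customer_id"):
    """Instead of deleting, tag the affected rows as inactive --
    the 'soft delete' pattern described above."""
    for row in warehouse_rows:
        if row[key_col] in deleted_keys:
            row["is_active"] = False
    return warehouse_rows

# Example: key 3 has vanished from the source system.
source_keys = [1, 2, 4]
warehouse = [
    {"customer_id": 1, "is_active": True},
    {"customer_id": 2, "is_active": True},
    {"customer_id": 3, "is_active": True},
    {"customer_id": 4, "is_active": True},
]

deleted = find_hard_deleted_keys(
    source_keys, [r["customer_id"] for r in warehouse]
)
apply_soft_delete(warehouse, deleted)
# The row with customer_id 3 is now flagged is_active = False,
# so the analytical store keeps its history intact.
```

In a real pipeline the key comparison would typically run as a set-based query (e.g. a NOT EXISTS join between a staging table and the target), but the principle is the same: detect the disappearance, then deactivate rather than destroy.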
