This message was sent first to subscribers of Data Craze Weekly newsletter.
We often hear about interesting products, clever algorithms and… great earnings in large technology companies.
What we rarely hear about are the problems they face, which, like the companies themselves, are often enormous.
That makes it all the more worthwhile to read about the bugs the giants of this world have to deal with.
This case from LinkedIn shows the scale:
Back in October 2018, we had an instance at LinkedIn when data quality problems affected the job recommendations platform. Client job views and usage decreased by 40 to 60% for a short period of time. Once this decline in views was detected, it took a total of 5 engineers 8 days to identify the root cause and 11 days to resolve the issue.
I have already written about what data contracts are in one of the previous editions of the newsletter.
In short, a data contract is simply an agreement between teams (typically Frontend and Backend/Data) on the schema used to exchange data and results.
This time I would like to return to this topic with an example from GoCardless.
The article does not dive into the deepest technical details, since those are specific to GoCardless itself.
What matters more is what they achieved:
It has allowed us to build what we refer to as our contract-driven data infrastructure, where from a Data Contract we can deploy all the tooling and services required to generate, manage and consume that data.
Although the concept itself is not new, we will be hearing more and more about it in the data world. Data quality is, and always has been, crucial, and any element of the SDLC (Software Development Lifecycle) that helps keep it at the highest possible level will be eagerly adopted.
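To make the idea concrete, here is a minimal sketch of what such an agreement could look like in practice. Everything here is hypothetical for illustration: the contract name, field names, and the `validate` helper are invented, not taken from GoCardless's tooling.

```python
# Hypothetical data contract: the producing team publishes a schema,
# and both producer and consumer validate records against it.
# All names and types below are invented for illustration.

CONTRACT = {
    "name": "payment_events",
    "version": 1,
    "fields": {
        "payment_id": str,
        "amount_cents": int,
        "currency": str,
    },
}

def validate(record: dict, contract: dict = CONTRACT) -> list:
    """Return a list of violations; an empty list means the record conforms."""
    errors = []
    for field, expected_type in contract["fields"].items():
        if field not in record:
            errors.append("missing field: " + field)
        elif not isinstance(record[field], expected_type):
            errors.append("wrong type for: " + field)
    return errors

ok = validate({"payment_id": "PM123", "amount_cents": 999, "currency": "GBP"})
bad = validate({"payment_id": "PM123", "amount_cents": "999"})
print(ok)   # no violations: the record matches the contract
print(bad)  # wrong type for amount_cents, and currency is missing
```

The point of "contract-driven" infrastructure is that a schema like this becomes the single source of truth from which validation, tooling, and pipelines can be generated, rather than each team hard-coding its own assumptions.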
A few years ago, I devoted a lot of time in my daily work to data visualization.
Thanks to great tools, I didn't have to build the visualizations from scratch.
My task was to match the visualization as closely as possible to the story the data was telling… and at the end of the day it all ended up in Excel 😀

What has always fascinated me, however, is the work of people who take visualization to an entirely different, far higher level. If you work with data every day and one of your tasks is to visualize it, let yourself be inspired.
Pluralith – visualize Terraform infrastructure directly from your codebase, fully automated.
Do you use Terraform to build infrastructure in your company or project? This tool is great for visualizing it: it shows how the individual resources are connected, with no major hassle or extra work.
There is a paid option and a completely free one.
Write two queries that are equivalent in terms of their result sets. Both should join sales data (the SALES table) with product data (the PRODUCTS table) on the PRODUCT_ID key.
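One possible answer, sketched below on a hypothetical schema (the column names beyond PRODUCT_ID are assumed): the same inner join expressed once with explicit `JOIN ... ON` syntax and once with the older comma-separated `FROM` list plus a `WHERE` condition. The sample data and sqlite3 harness are just there to prove the two result sets match.

```python
import sqlite3

# Hypothetical SALES / PRODUCTS schema; only PRODUCT_ID comes from the exercise.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PRODUCTS (PRODUCT_ID INTEGER PRIMARY KEY, NAME TEXT);
    CREATE TABLE SALES (SALE_ID INTEGER PRIMARY KEY, PRODUCT_ID INTEGER, AMOUNT REAL);
    INSERT INTO PRODUCTS VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO SALES VALUES (10, 1, 9.99), (11, 2, 19.99), (12, 1, 4.50);
""")

# Query 1: explicit JOIN ... ON syntax.
explicit_join = conn.execute("""
    SELECT s.SALE_ID, p.NAME, s.AMOUNT
    FROM SALES s
    JOIN PRODUCTS p ON p.PRODUCT_ID = s.PRODUCT_ID
    ORDER BY s.SALE_ID
""").fetchall()

# Query 2: implicit join via the FROM list and a WHERE condition.
implicit_join = conn.execute("""
    SELECT s.SALE_ID, p.NAME, s.AMOUNT
    FROM SALES s, PRODUCTS p
    WHERE p.PRODUCT_ID = s.PRODUCT_ID
    ORDER BY s.SALE_ID
""").fetchall()

print(explicit_join == implicit_join)  # identical result sets
```

Both forms produce the same inner join; the explicit `JOIN ... ON` syntax is generally preferred today because it separates join conditions from filtering logic.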
You can find more SQL-related questions at SQL - Q&A
- Principal Data Architect, APPSBROKER – London, Swindon, Remote, Romania – £80,000 – £100,000 (B2B / Yearly)
Skills sought: GCP / AWS, ETL tools (e.g. Matillion), Data Architecture, SQL, RDBMS, Python
- Database Developer, PCMI – Kraków, Remote (Hybrid) – PLN 10,500 – PLN 14,500 (B2B / Monthly)
Skills sought: MS SQL Server, SSRS, SSIS, SQL