Case Studies

Below are some examples of typical projects that we have worked on over the years.


Manually Reviewing Content and Updating a B2B Portal


A Dutch B2B portal that provides general business information saved more than $7.6 million by outsourcing to Tapestry Data in 2019 alone.  Tapestry Data performs data retrieval, extraction, and manual updates, and has helped the portal grow from a startup into one of the Netherlands' leading B2B information sources.


Our team performs 85% of their manual updates and assists with:

  • Monitoring specific sites for new financial and company information

  • Extracting key indicators from published documents

  • Reading the notes in financial statements and extracting noteworthy points

  • Validating information provided by automatic recommender systems

  • Annotating machine learning training sets

In 2019, we:

  • Processed more than 230,000 financial reports

  • Extracted information from 50,000 company legal documents

  • Updated key information for over 250,000 companies


Manually Tagging News Articles

Example 1:

Het Financieele Dagblad, a Dutch daily financial newspaper with a circulation of about 50,000, needed assistance classifying articles in order to personalize readers' online experience.  For four years, Tapestry Data's team read their news articles and tagged each one with keywords, 17 hours a day, six days a week.  We always completed this work while an article was still in final draft and, for breaking news, no later than ten minutes after publication.

In total, the team annotated over 100,000 articles and built a high-quality training set which we used to build an automated tagging process.

Example 2:

Tapestry Data saved a mid-sized Dutch company $500,000 by reviewing and tagging 4,000 news articles per day in order to generate alerts for customers on the companies they follow.  Our team of seven full-time staff tagged each article with the relevant company, sector, and topic, creating the training set that, after four years, served as input for an automated recommender system.
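In spirit, this kind of tagging can be sketched as a keyword lookup whose human-reviewed results accumulate into a training set. The rule table, labels, and function names below are illustrative assumptions, not the client's actual taxonomy:

```python
# Hypothetical sketch: tag an article with company and sector labels
# by matching keywords against a small rule table.

TAG_RULES = {
    "company": {"ASML": ["asml"], "Philips": ["philips"]},
    "sector": {"Technology": ["chip", "semiconductor"],
               "Healthcare": ["hospital", "medical"]},
}

def tag_article(text, rules=TAG_RULES):
    """Return {dimension: [labels]} for every keyword found in the text."""
    lowered = text.lower()
    tags = {}
    for dimension, labels in rules.items():
        hits = [label for label, keywords in labels.items()
                if any(kw in lowered for kw in keywords)]
        if hits:
            tags[dimension] = hits
    return tags

article = "ASML ships a new chip lithography system to a semiconductor fab."
print(tag_article(article))  # {'company': ['ASML'], 'sector': ['Technology']}
```

In production, the human-corrected tags, rather than the raw keyword matches, become the labels used to train the recommender.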


Researching Data Sources


Tapestry Data is helping senior analysts create and refine search queries and categorize information for Owlin, a Series A European startup whose news analytics tool serves finance professionals.  We have helped reduce the client's research costs by 60%.


Boosting Data Quality in Payment Processing


Our client: One of the largest receivables management companies in Europe

Problem: Financial institutions lose 15% to 25% of revenue to poor data quality, with data scattered across transaction systems, corporate databases, news and social media feeds, ERP/CRM infrastructure, legacy systems, and cloud/SaaS applications.


Solution: AI-based software that boosts the quality of data assets, controlled from a single web-based interface.


• Integrate internal and external sources

• Classify and combine data points from non-unified and unstructured data environments

• Deduplicate complex data arrangements based on customizable rules

• Migrate non-integrated and divergent databases

• Create simple, clean structures from highly complex datasets

• Prepare data for ML/AI implementation
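As an illustration of the rule-based deduplication step, the sketch below merges records that agree on a normalised key, keeping the most recently updated copy. The field names and normalisation rules are assumptions for the example, not the product's actual logic:

```python
# Hypothetical sketch of rule-based record deduplication: records that
# agree on a normalised key (company name + country) are merged,
# keeping the most recently updated copy.

def normalise(record):
    """Build a dedup key using customizable rules: case-fold, strip suffixes."""
    name = record["name"].lower().replace("b.v.", "").strip()
    return (name, record["country"].upper())

def deduplicate(records):
    best = {}
    for rec in records:
        key = normalise(rec)
        # Rule: on a key collision, keep the newest record.
        if key not in best or rec["updated"] > best[key]["updated"]:
            best[key] = rec
    return list(best.values())

records = [
    {"name": "Acme B.V.", "country": "nl", "updated": "2019-01-10"},
    {"name": "acme", "country": "NL", "updated": "2019-06-01"},
    {"name": "Umbrella", "country": "DE", "updated": "2019-03-15"},
]
print(deduplicate(records))  # the 2019-06-01 Acme record plus Umbrella
```

Real deployments would layer many such rules (address matching, fuzzy name similarity, registry identifiers) on top of this skeleton.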


Results: One of the biggest receivables management companies in Europe received a fully featured SaaS solution that was integrated across several departments. The company's newly fortified data management platform improved relationships with regulators and trust with customers, who want their account managers to be vigilant in identifying questionable financial transactions and unrealized payments.

•  Improved customer-support quality ratings from 80% to over 95%

•  Increased the success rate of payee identification

•  Reduced the existing backlog of uncollected receivables, completing 26,000 in just 12 weeks

•  Reduced the risk of regulatory non-compliance, reputational damage, and potential financial crime losses


Improving Quality and Performance in Asset Management

Our client: A Global Asset Management firm

Problem: Our client was challenged by a growing range of investments and limited visibility across the wide range of asset types in its portfolio, including a multitude of global stocks, OTC derivatives, bonds, currencies, and potential new products of interest. The team needed to eliminate time-intensive manual reporting processes while adjusting analytics and results based on its view of the global markets.

Solution: We delivered an integrated performance, analytics, attribution, and risk solution to improve data analytics and streamline reporting. The solution is trusted by an ever-growing number of global asset managers to enhance their investment strategies and better service their clients, while maintaining high levels of security, risk management, process control and operational scale.

Results: Тhe global multi-asset class manager has increased data quality by 30% and reduced reporting times by as much as three days each month, while freeing hundreds of working hours for analysts to focus on client needs.


Thanks to our portfolio reconciliation service, based on neural network algorithms, the client also benefited from more insightful modelling analytics and broader security coverage across all assets. We continue to provide transparency into the product road map and support the team's development requests through our dedicated team of specialists. Additionally, custom functionality to calculate return attribution for fixed income positions was implemented using the client's own yield curves.


Building Deep Learning Systems for Early Cancer Detection

Problem: Early brain cancer detection prolongs patient survival and delays recurrence.  It is, however, difficult to differentiate between cancerous and non-cancerous tissue, especially in transitional, infiltrative zones. About one fifth of misses on ultrasound and magnetic resonance imaging (MRI) are the result of interpretive error.  Automated screening, computer-assisted detection, and diagnosis assistance can improve detection rates.

Solution: Tapestry Data accelerated and improved the chances of successful drug development and clinical care for brain diseases by developing an automated image analysis tool that employed deep learning algorithms on large numbers of MRI and CT brain scan images.

Results: The result was an environment that facilitates objective decision-making based on imaging data insights, while at the same time storing multi-site data for future clinical studies and trials.


The first trial consisted of approximately 1,700 512x512-pixel images collected from 874 patients.  These images were classified into 5 groups, achieving up to 95% specificity, 30% sensitivity, and 75% accuracy.  Based on our initial results, multiple Neuro AI algorithms have been developed for detection of Multiple Sclerosis, Alzheimer’s Disease, Parkinson’s Disease, Brain Oncology, Neuropsychiatric Diseases and Stroke.
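For reference, the reported metrics follow from a standard confusion-matrix calculation. Treating detection as binary for illustration, the counts below are invented so the arithmetic reproduces the 95%/30%/75% figures; they are not the trial's actual numbers:

```python
# Standard screening metrics from a binary confusion matrix.
# The tp/fp/tn/fn counts below are illustrative, not the trial's data.

def screening_metrics(tp, fp, tn, fn):
    specificity = tn / (tn + fp)                # true-negative rate
    sensitivity = tp / (tp + fn)                # true-positive rate (recall)
    accuracy = (tp + tn) / (tp + fp + tn + fn)  # overall hit rate
    return specificity, sensitivity, accuracy

# Illustrative split: 400 cancerous scans, 900 non-cancerous scans.
spec, sens, acc = screening_metrics(tp=120, fp=45, tn=855, fn=280)
print(f"specificity={spec:.0%} sensitivity={sens:.0%} accuracy={acc:.0%}")
# specificity=95% sensitivity=30% accuracy=75%
```

The gap between high specificity and low sensitivity is typical for early screening tools tuned to minimise false alarms on non-cancerous tissue.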


The use of our platform results in:

  •  60% cost savings in imaging collection and storage 

  •  90% cost savings in advanced image analysis 

  •  80% time savings by automated advanced image analysis 

  •  75% site burden reduction