
Rethinking Evidence in Rare Disease Research: A Case Study Using Propensity Score Methods

Rare diseases pose unique challenges for researchers and clinicians. Due to small patient populations, conducting randomized controlled trials (RCTs) is often impractical or ethically difficult. As a result, observational data becomes a key source of evidence.

In the landscape of rare disease, data is both our most precious resource and our greatest challenge. For conditions like Infantile-Onset Pompe Disease (IOPD), the journey from the first life-saving Enzyme Replacement Therapy (ERT) to the next generation of optimized treatments is rarely a path free of challenges. It is a path marked by small patient populations, high clinical variability, and the heavy weight of every data point.

The difficulty in rare disease research often lies in the “how”: How do we prove a new therapy is truly superior when baseline functional levels vary so wildly? How do we ensure that a single data entry error doesn’t mask a breakthrough or suggest a false decline?

In this blog, we explore how propensity score methods can be used to estimate treatment effectiveness in a rare disease setting through a real world–inspired case study. Pulling back the curtain on the analytical rigor required to compare motor function trajectories in IOPD, from Propensity Score Matching to “red-flag” data auditing, we show how sophisticated analysis turns fragmented data into a clear roadmap for the future of neuromuscular treatment.

 

Case study: Advancing motor function outcomes in IOPD

The evolution from first-generation drug to next-generation drug

Infantile-Onset Pompe Disease (IOPD) is a rare, progressive neuromuscular disorder. While the first generation of ERT revolutionized survival, the quest for superior motor function remains the “North Star” for researchers. This study compares longitudinal motor outcomes between the First-Generation Drug and Next-Generation Drug cohorts using the Gross Motor Function Measure (GMFM-88).

 

The challenge: Comparing across clinical trials

Comparing results from different studies requires more than just looking at averages; it requires accounting for the inherent variability in how patients present at baseline. To test the hypothesis that the Next-Generation Drug offers a superior motor trajectory, we implemented a rigorous three-tier analytical approach.

 

A three-tier analytical approach

1. The power of precise matching

To ensure an “apples-to-apples” comparison, we restricted the analysis to patient pairs matched by both age and baseline functional level.

  • The criteria: Matches were strictly filtered to those within a ±13-point window of the GMFM-88 raw score (rather than a percentage).
  • The goal: By tightening these parameters, we eliminated “baseline noise,” allowing the true pharmacological impact of the treatment to surface in the longitudinal graphs.
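The pairing rule above can be sketched in a few lines. In this illustrative Python sketch, the 13-point raw-score window comes from the study itself, but the age window, the record layout, and the greedy 1:1 pairing strategy are our assumptions for demonstration:

```python
def matched_pairs(cohort_a, cohort_b, max_age_gap=0.5, score_window=13):
    """Greedily pair patients across cohorts on age and baseline GMFM-88
    raw score. The 13-point raw-score window follows the study; the age
    window and record layout are illustrative assumptions."""
    pairs, used = [], set()
    for a in cohort_a:
        for b in cohort_b:
            if b["id"] in used:
                continue
            if (abs(a["age"] - b["age"]) <= max_age_gap
                    and abs(a["gmfm_raw"] - b["gmfm_raw"]) <= score_window):
                pairs.append((a["id"], b["id"]))
                used.add(b["id"])
                break  # 1:1 matching: one partner per patient
    return pairs
```

The tighter the windows, the fewer pairs survive, but the pairs that remain are a closer “apples-to-apples” comparison.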

 

2. Data integrity: Investigating the “jumps and drops”

In rare disease registries, a single data point can skew an entire trajectory. Our team conducted a “deep dive” into five specific patient profiles that exhibited extreme volatility — marked by sharp drops or vertical jumps in scores.

Expert insight: A drop to zero isn’t always a clinical decline; often, it’s a data entry artifact where a missing value was defaulted to ‘0.’ By identifying and correcting these anomalies, we ensure the motor trajectory reflects biology, not a spreadsheet error.
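An automated “red-flag” screen in this spirit might look like the following sketch. The record layout and the prior-score threshold are assumptions chosen for illustration, not the audit procedure used in the study:

```python
def flag_suspect_zeros(visits, min_prior_score=20):
    """Flag visits where a score falls to 0 straight from a substantial
    prior value, a pattern more consistent with a missing value defaulted
    to '0' than with genuine clinical decline. Record layout and the
    min_prior_score threshold are illustrative assumptions."""
    ordered = sorted(visits, key=lambda v: v["visit"])
    return [curr["visit"]
            for prev, curr in zip(ordered, ordered[1:])
            if curr["score"] == 0 and prev["score"] >= min_prior_score]
```

Flagged visits are candidates for manual review, not automatic correction; a clinician still decides whether each one reflects biology or a spreadsheet error.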

 

3. Sophisticated balancing: Propensity Score Matching (PSM)

Propensity score methods help simulate a randomized experiment by balancing observed characteristics between treated and untreated groups.

To further validate our findings, we moved beyond simple matching to Propensity Score Matching. This statistical technique allows us to predict a patient’s likelihood of being in a specific treatment group based on their baseline characteristics, effectively “balancing” the two groups.

 

Key covariates included:

  • Baseline status: Age and GMFM-88 total raw score.
  • Clinical history: Age at diagnosis and age at start of ERT.
  • Biological markers: CRIM status (Cross-Reactive Immunologic Material) and LVMI (Left Ventricular Mass Index) z-scores.
  • Treatment variables: Specific enzyme dosage levels.
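To make the mechanics concrete, here is a from-scratch sketch of the two steps behind PSM: fitting a logistic model of treatment assignment to obtain each patient’s propensity score, then greedy 1:1 nearest-neighbour matching on those scores. This is illustration only; a real analysis would use an established statistical library and the full covariate set listed above:

```python
import math

def propensity_scores(X, treated, lr=0.5, epochs=2000):
    """Estimate P(treatment | covariates) with a from-scratch logistic
    regression fit by gradient descent, returning each patient's
    propensity score. Illustrative sketch, not a production fit."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, ti in zip(X, treated):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid
            for j in range(d):
                gw[j] += (p - ti) * xi[j]
            gb += p - ti
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return [1.0 / (1.0 + math.exp(-(b + sum(wj * xj for wj, xj in zip(w, xi)))))
            for xi in X]

def match_on_score(scores, treated, caliper=0.1):
    """Greedy 1:1 nearest-neighbour matching on the propensity score;
    treated patients with no control inside the caliper are dropped."""
    controls = [j for j, t in enumerate(treated) if t == 0]
    treated_idx = [i for i, t in enumerate(treated) if t == 1]
    pairs, used = [], set()
    for i in treated_idx:
        candidates = [(abs(scores[i] - scores[j]), j)
                      for j in controls if j not in used]
        if candidates:
            dist, j = min(candidates)
            if dist <= caliper:
                pairs.append((i, j))
                used.add(j)
    return pairs
```

After matching, balance should always be checked (e.g. standardized mean differences of each covariate between the matched groups) before estimating treatment effects.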

 

Why this matters for the rare disease community

This case study demonstrates that in the world of rare diseases, how we analyze data is as important as the data itself. By correcting for entry errors and using high-fidelity matching, we can more clearly see if the next-generation drug truly provides the “superior trajectory” hypothesized.

 

Precision analytics as a catalyst for care

By applying high-fidelity matching and propensity score modeling, we move beyond “average” results to understand the true potential of new interventions. Furthermore, our dedication to data integrity — manually investigating anomalies and “red-arrow” outliers — ensures that our conclusions are built on a foundation of clinical reality rather than administrative error.

Ultimately, this study reinforces that in the fight against rare diseases, data is our most powerful ally. When we refine our lens through rigorous matching and clean data, the path toward better motor function and brighter futures for IOPD patients becomes clearer than ever.

Collective Leadership Models Emerging in the Life Sciences

Collective Leadership at PHUSE APAC Connect and Beyond

In clinical research, structure defines much of how we operate. We work within protocols, regulatory frameworks, statistical hierarchies, and governance models. Accountability is clear. Escalation paths are defined. Ownership is documented. Structure gives us control, but participation gives us adaptability. And in today’s life sciences environment, adaptability matters more than ever.

As PHUSE APAC Connect kicks off its inaugural edition, what stands out is not simply its expansion into a new geography, but rather the way it is being built. It is being shaped collectively by stream leaders, contributors, presenters, and sponsors who have chosen to engage because they care about advancing clinical data and analytics.

PHUSE APAC Connect reflects something larger than an event. It reflects a shift in how influence works in our industry: influence is becoming more distributed, and leadership must evolve accordingly.

The community convenes not because it is instructed to, but because its members understand that progress in complex systems is co-created. Leadership through participation is no longer an abstract idea. It is how real progress happens.

 

Complexity has changed the rules

Over the past decade, the life sciences landscape has changed in meaningful ways:

  • Clinical programs span continents
  • Data volumes have expanded dramatically
  • Regulatory expectations continue to evolve
  • Digital transformation is no longer a roadmap; it is daily reality

We now operate within interconnected ecosystems rather than isolated silos. A trial design decision in one region influences submission strategy in another. An analytics innovation within one capability center can reshape processes globally. In such a system, centralized control has limits — contribution does not.

Participation is not symbolic; it has practical impact:

  • It shortens decision cycles
  • It enables faster knowledge sharing
  • It strengthens collective memory
  • It reduces vulnerability when complexity increases

Alignment in environments like ours cannot simply be mandated. It must be built. As Peter Drucker once said, “The best way to predict the future is to create it.” In our field, that creation happens through consistent collaboration. It happens when experienced professionals step forward, share openly, and help others navigate complexity.

Working together turns expertise into progress.

 

A parallel evolution: GCCs beyond arbitrage

In my recent white paper, “Beyond Cost Arbitrage: How Global Capability Centers Are Becoming Engines of Life Sciences Innovation,” I explored a transformation that closely parallels this shift.

Global Capability Centers (GCCs) were once primarily positioned around cost and scale. They were designed to optimize labor economics and expand operational capacity. That model delivered value in an earlier phase of globalization. Today, that view no longer captures the full picture.

Across life sciences, GCCs have matured into integrated capability hubs. They bring together clinical scientists, statisticians, regulatory specialists, advanced analytics teams, and digital engineers. They influence submission strategy, automation initiatives, and enterprise transformation efforts.

The most meaningful shift I observed was not structural. It was psychological. Leaders within these centers began to see themselves not as recipients of strategy, but as contributors to it. That shift changes the dynamic entirely.

When capability centers help shape standards, architecture, and innovation priorities, they move from supporting enterprise strategy to strengthening it. The center of gravity becomes more distributed, and with it, so does leadership.

That same redistribution of influence is visible in communities like PHUSE APAC Connect.

 

Collective stewardship in data standardization

Data standardization provides another perspective.

Standards do not evolve because they are declared. They evolve because experienced practitioners examine them, question them, refine them, and test them across real-world applications.

Respected contributors in this space, including colleagues such as Angelo Tinazzi, demonstrate how credibility is built over time through sustained engagement. Consistent participation in standards forums and industry dialogue reinforces an important principle. Influence in data and standardization is earned through contribution.

In global standardization efforts, credibility compounds gradually:

  • Participation builds trust.
  • Collaboration builds alignment.
  • Alignment strengthens regulatory confidence.

Shared stewardship of standards is not an idealistic concept; it is central to ensuring submission quality and regulatory trust.

 

What this means for Cytel

For us at Cytel, this discussion is more than conceptual.

We operate at the intersection of science, statistics, and regulatory strategy. Our work shapes trial design decisions, submission readiness, analytical rigor, and ultimately patient outcomes.

In that context, expertise alone is not enough; engagement matters. Participating actively in communities like PHUSE helps us stay aligned with evolving expectations, exchange knowledge across regions, and contribute meaningfully to broader industry progress.

As capabilities become more globally distributed, leadership must become more inclusive and collaborative. Participation is not an extension of our strategy; it sits at its core.

Collective leadership strengthens resilience. It increases learning velocity and helps organizations adapt with confidence in an environment that continues to evolve.

 

From regional milestone to industry signal

PHUSE APAC Connect represents more than a regional milestone. It signals that APAC, supported by expanding GCC ecosystems and deep domain expertise, is not simply a delivery geography. It is an active contributor to global thought leadership.

When professionals volunteer their time to shape agendas, share implementation insights, and mentor emerging talent, they strengthen the connective tissue of the industry.

Leadership does not weaken when it is shared. It becomes more durable. In distributed systems, shared ownership strengthens outcomes.

 

Closing reflection

Across this industry, one observation continues to hold true: titles define reporting structures, but participation defines influence. In complex environments, authority may initiate progress, but contribution sustains it.

The future of clinical data and analytics will be shaped by those who consistently engage, who collaborate across boundaries, and who invest in strengthening the ecosystem around them.

Leadership is not something granted once. It is something practiced repeatedly, and participation is how it is practiced at scale.

 

Interested in learning more?

Download my new white paper, “Beyond Cost Arbitrage: How Global Capability Centers Are Becoming Engines of Life Sciences Innovation”:

Clinical Data Management’s Next Evolution: From Data Stewardship to Data Intelligence

Clinical Data Management (CDM) is undergoing a fundamental transformation. What was once primarily a function focused on data collection, validation, and cleaning is now emerging as a strategic, technology-driven discipline at the heart of modern clinical research.

Today’s trials generate unprecedented volumes of complex data. A recent Tufts Center for the Study of Drug Development survey found a 7x increase in data points and a 4x increase in data sources. Here at Cytel, we have seen studies with over 20 data sources. Beyond traditional electronic data capture (EDC), clinical studies increasingly incorporate electronic health records (EHRs), wearable devices, mobile applications, genomics, imaging, and real-world evidence (RWE). While these data sources create enormous potential for deeper insight, they also introduce new challenges that conventional CDM approaches were never designed to handle.

To unlock the value of this expanding data universe, clinical organizations must rethink not only their tools, but also their talent, workflows, and mindset.

 

The rise of new roles in clinical data management

This evolution has created demand for new, specialized roles that bridge clinical knowledge, data science, and technology:

 

Clinical Data Scientist (CDS)

Clinical Data Scientists focus on extracting insight from complex medical data. They apply advanced analytics, visualization, and domain expertise to uncover trends, assess data quality risks, and support clinical and operational decision-making.

 

Clinical Data Engineer (CDE)

Clinical Data Engineers design and maintain the data infrastructure that makes modern analytics possible. They build robust, compliant data pipelines, integrate diverse data sources, and ensure data is reliable, traceable, and analysis-ready across the clinical trial ecosystem.

 

Together, these roles move CDM beyond data stewardship toward true data enablement.

 

The expanding complexity of clinical data

Modern clinical trials are no longer linear or siloed. Data flows continuously from multiple sources, often in near real time, and in formats that vary widely in structure, granularity, and reliability. Managing this complexity requires more than rule-based checks and manual reviews. Organizations need scalable data architecture, advanced analytics, and intelligent monitoring approaches that can adapt as data volume, velocity, and variety increase. This shift marks a move away from reactive data cleaning toward proactive data intelligence.

 

Why data visualization matters more than ever

As data points multiply, traditional listings and static reports quickly become unmanageable. Data visualization is no longer a “nice to have”; it is essential. Advanced visual analytics enable clinical teams to identify patterns, compare data across sites, and detect emerging issues early, before they compromise data quality or timelines. By transforming complex datasets into intuitive visual insights, teams can move faster, ask better questions, and focus attention where it matters most.

 

Figure 1: Early Detection of Data Quality Risks through Data Visualization Use Case

Systemic audit trail analysis and regulatory expectations

Regulatory expectations are also evolving alongside data complexity. The 2023 EMA guidance places increased emphasis on audit trail review, signaling a shift from point-in-time checks to systemic analysis. Manual audit trail reviews are no longer sufficient at scale. Instead, sponsors and CROs must adopt analytical approaches that continuously monitor audit trail activity and identify unusual patterns. This supports site fraud detection, risk-based quality management, and inspection readiness. Analytics-driven audit trail review not only improves compliance but also strengthens overall data integrity and operational oversight. In short, audit trail data needs to be treated much like clinical data. In 2025, Cytel was made aware of multiple sponsors being asked by regulatory authorities to provide evidence of a systematic review of their audit trail data.
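As a toy illustration of “systemic” rather than manual review, a first-pass screen might flag sites whose audit-trail edit volume sits far from the cross-site norm. The z-score approach, the threshold, and the per-site edit count below are our assumptions for illustration, not a method prescribed by the EMA guidance:

```python
from statistics import mean, pstdev

def flag_atypical_sites(edits_per_site, z_threshold=2.0):
    """Flag sites whose audit-trail edit volume deviates strongly from
    the cross-site norm. A first-pass z-score screen for illustration;
    the threshold and the edit-count metric are assumptions."""
    counts = list(edits_per_site.values())
    mu, sd = mean(counts), pstdev(counts)
    if sd == 0:  # all sites identical: nothing stands out
        return []
    return sorted(site for site, n in edits_per_site.items()
                  if abs(n - mu) / sd > z_threshold)
```

Flagged sites are leads for risk-based review, not verdicts; the pattern could reflect fraud, a training gap, or simply a high-enrolling site.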

 

Figure 2: Systemic Audit Trail Analysis Use Case

From comprehensive reviews to trend and outlier detection

In a world of big data, reviewing everything is neither practical nor effective. The future of data cleaning lies in intelligent prioritization. By leveraging statistical methods and trend analysis, CDMs can shift from exhaustive data review to targeted investigation focusing on outliers, inconsistencies, and meaningful deviations. This reduces manual effort while improving data quality outcomes, aligning with risk-based monitoring principles, and enabling faster, more confident decision-making throughout the trial lifecycle. It is accomplished by statistically analyzing data variability, much as statistics are used to evaluate safety and efficacy, and assigning risk levels to the various checks performed; an overall risk level is also derived, and targeted data checks are run based on that analysis.
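A minimal sketch of this prioritization idea, assuming simple z-score screening per check (the thresholds and risk labels here are illustrative assumptions, not a validated risk model):

```python
from statistics import mean, pstdev

def assign_check_risk(values, medium_z=1.5, high_z=3.0):
    """Assign a risk level to one data check from the variability of the
    values it screens, so review effort targets outliers rather than
    every record. Thresholds and labels are illustrative assumptions."""
    mu, sd = mean(values), pstdev(values)
    if sd == 0:
        return "low", []
    z = [(v, abs(v - mu) / sd) for v in values]
    outliers = [v for v, zv in z if zv >= high_z]
    if outliers:
        return "high", outliers   # targeted investigation required
    if any(zv >= medium_z for _, zv in z):
        return "medium", []       # monitor on the next review cycle
    return "low", []
```

An overall study risk level could then be rolled up from the per-check levels, with only “high” checks triggering targeted data review.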

 

Figure 3: Risk-Based Data Cleaning Use Case

Building insight-ready clinical data ecosystems

The future of clinical data management is not defined by a single tool or technology, but by an ecosystem: one that combines modern platforms, advanced analytics, and specialized talent.

Organizations that invest in insight-ready data architectures and deploy the right expertise will be better positioned to improve data quality, accelerate timelines, and generate deeper insights from increasingly complex datasets. As clinical research continues to evolve, CDM’s role is expanding from managing data to unlocking its full strategic value.

 

Interested in learning more?

William Baker and Jenn Sustin will be hosting the webinar “Enabling the Shift to Clinical Data Science and Engineering for Modern Trials” on February 18 at 10 am ET:

2025 in Perspective: Reflections From Our Newest Colleagues

Every year brings new faces, fresh ideas, and inspiring stories to Cytel. In 2025, these colleagues joined us from across the globe, each bringing unique experiences and ambitions. As the year closes, we asked them to share what stood out, what they’ve learned, and how they see their work shaping something bigger. Their reflections tell a story of connection, growth, and purpose.

 

Joining Cytel: Memorable moments and settling in

For Kasum de Souza Mateus (Senior Biostatistician, FSP) the most memorable part of joining Cytel was simple yet meaningful: “Being able to meet colleagues and mentors in person.” That feeling of connection resonated with many new Cytelians, from Adish Jindal (Senior Recruiter), who described the joy of reconnecting with familiar faces, to Luke Hilliard (Event Manager), who fondly recalls a team meeting: “I really enjoyed the trip to Bruges. It was such a pleasure meeting everyone in person. We came away with some fantastic ideas that we’ve since put into action for our events.”

Others found their defining moments and success in challenges that brought people together. Kanchan Kulkarni (Manager, Accounting) stepped into her role during a major system transition: “One of my most memorable experiences has been leading the Global GL Accounting function across EMEA, APAC, and NA regions during our Oracle ERP transition. It wasn’t just about systems and numbers — it was about connecting people, aligning processes, and building something stronger together.” And for Scott Rogers (CFO), the most powerful moment came during a Town Hall: “I was very moved by the presentation where we heard directly from a patient and understood how our work helped him realize the benefits he was seeing.”

For Macarena Pazos Maidana (Senior Market & Business Development Manager) success came early: “During my third week, I successfully secured a key renewal with a major pharmaceutical client for the East Horizon™ platform. This achievement not only boosted my confidence but also reinforced my belief in the value our solutions bring to the industry.” And Hannes Engberg Raeder (Principal Biostatistician, FSP) found pride in collaboration: “I’m proud of having been able to support one of our partnerships through process improvements that helped strengthen collaboration and overall efficiency.”

 

Leaning on advice

Of course, starting something new means leaning on advice from colleagues or mentors, and some words of wisdom stuck. Nicole Sheridan (Manager, Talent Management) shared the famous mantra that shaped her approach: “’Do or do not, there is no try.’ It’s simple, but it completely changed how I think about my work and even life outside of work. I realized it’s not about being perfect but it’s about showing up, committing, and seeing things through. That mindset has really helped me take initiative, stay resilient, and turn ideas into results.”

Damian Kowalski (Principal Statistical Programmer, FSP) emphasized collaboration: “Don’t be afraid to ask questions. Collaboration is our strength.” And Sydney Jenkins (Senior Employee Relations & Engagement Partner) shared a perspective that guides her work: “Trust your logic. That perspective reminds me to approach challenges with a clear, rational mindset, even under pressure!”

 

Growth and ambition

This year was not only about settling into their role for our new Cytelians, however. It also marked a year of growth and achievements. Adish honed his global recruitment expertise: “One skill I’m particularly proud of developing in 2025 is my ability to manage global recruitment processes more effectively.” Monica Chaudhari (Associate Director, Biostatistics, FSP) shared a technical milestone: “My first study that I got assigned to was already closed. To help myself support the team through database lock, review of final outputs and drafting of the CSR, I created a swimmers plot summarizing all important endpoints on each subject’s trajectory that helped identify major deviations.”

Valeria Duque Mora (Project Coordinator, Resource Management) reflected on teamwork: “My current team has made a real difference in my daily work. They are the foundation of our success, always supporting each other and sharing new information with kindness and collaboration throughout every process.” For Dominika Wisniewska (Senior Statistical Programmer, FSP), the impact was deeply personal: “I am grateful that Cytel gave me the opportunity to work directly for our client where I work on research within rare diseases and neurology diseases. I am particularly interested in neuro because of personal reasons, and I am happy to participate in maybe discovering new treatments.” And Sankhyajit Sengupta (Senior Statistical Programmer, FSP) embraced learning: “In this very short period of time (three months), I’ve had the opportunity to gain exposure to R programming in live studies and also completed required trainings on R, an important step as the industry is moving in this direction.”

Looking ahead, our new colleagues are already thinking about how to make an even bigger impact in 2026. Kanchan hopes to drive automation and efficiency, Luke dreams of organizing a standalone event, and Ye Miao (Associate Director, Biostatistics, FSP) plans to deepen expertise in R programming to contribute more effectively to data analysis and reporting tasks in his FSP role. Sydney aims to strengthen policy awareness and consistency across the organization, while Macarena is focused on enhancing client retention and satisfaction. Each goal reflects a commitment to deepening their impact in year two.

 

Connecting to the bigger picture

Every role at Cytel connects to our mission of improving patient lives. Adish summed it up well: “As a Global Senior Recruiter, I help bring in the talent that powers our mission. Every great hire strengthens our culture, drives innovation, and helps the company achieve its goals globally.” Wyatt Gotbetter (Senior Vice President, Global Head Evidence, Value and Access) described the EVA team’s role: “I like to describe the work of EVA as the essential ‘last mile’ in our client’s drug development journey — after decades of scientific discovery, animal and human trials, and regulatory approvals, we play a vital role in helping ensure patients get access to needed therapies.” And Damian reminded us of the impact behind the data: “Every dataset we program and validate helps ensure reliable insights for clinical trials. It’s amazing to know that our work plays a role in bringing life-saving therapies to patients worldwide.”

 

The voices of our newest colleagues remind us that Cytel is more than a workplace. It’s a community driven by purpose, collaboration, and innovation. Here’s to their continued success and to another year of making a difference together.

Beyond the Database: How Clinical Data Management Transforms Patient Care

When we think about clinical data management (CDM), it is often easy to picture databases, spreadsheets, and documents for days. However, being able to step into a clinic setting and witness how data-driven decisions shape patient care reveals the true impact of CDM.

Here, I share real-world examples of the impact of clinical data management on patients and what lies ahead for the field as technology advances.

 

From data to decisions: The impact of clinical data management in the clinical setting

Every piece of data collected during a clinical trial, be it lab results, procedure information, patient-reported outcomes, or even adverse events, tells a story. During trials, these individual stories converge to guide treatment plans, ensure safety, and improve outcomes. Accuracy and speed are critical in data entry and processing because they allow clinicians to make informed decisions without delay, reducing risks for patients. Without this precision, even groundbreaking therapies can stumble due to incomplete or unreliable information.

 

Real-world examples of CDM impact

Spotting issues early

In an oncology trial, centralized monitoring picked up unusual liver enzyme levels across several patients. Because of that insight, clinicians were able to tweak treatment plans right away, preventing serious side effects and keeping patients safe.

 

Identifying dosing mistakes

During a diabetes study, data checks uncovered inconsistencies in insulin doses. Fixing those errors ensured patients got the right amount of medication, reducing the risk of hypoglycemia and keeping the study on track.

 

Keeping patients engaged

Real-time data review revealed a trend of missed visits in a cardiovascular trial. Sharing this with site teams led to proactive outreach, helping patients stay on schedule and reducing dropout rates.

 

Bridging science and care

Clinical data managers play a behind-the-scenes role, but their work directly influences what happens in the exam room. For example:

 

Keeping data consistent

Consistency ensures that trial results are reliable and can be applied to real-world care, not just on paper.

 

Building trust in the numbers

Data integrity means clinicians can rely on the information when adjusting dosages or monitoring side effects. No second-guessing, just confidence.

 

Protecting patients and speeding up progress

Regulatory compliance isn’t just about ticking boxes — it keeps patients safe and helps move promising therapies from research to approval faster.

 

Better communication

Real-time data sharing helps patients stay informed about their progress, reducing uncertainty.

 

Fewer repeat visits

Catching errors early means patients avoid unnecessary trips back to the clinic, saving time and stress.

 

The human element — My perspective

As a Principal Clinical Data Manager, I’ve had the privilege of seeing this impact firsthand. One moment that stands out was during a rare disease trial where every day mattered for patients waiting for treatment. By streamlining data cleaning and resolving queries quickly, we helped lock the database ahead of schedule. Knowing that this effort contributed to patients receiving life-changing therapy sooner was incredibly rewarding.

It’s in these moments that the connection between data and human lives becomes crystal clear. Behind every query, every validation check, there’s a patient hoping for better health and that’s what drives our work. CDM is not just about compliance; it’s about compassion through precision.

 

Looking ahead

As technology advances, the integration of real-time data and AI-driven insights will make clinical data management even more impactful. The clinic will become a hub where data flows seamlessly, supporting personalized medicine and improving patient experiences. Predictive analytics could help identify risks before they occur, and automation will free up time for deeper analysis. The future of CDM isn’t just about managing data; it’s about transforming care.

In short, clinical data management isn’t just a technical process; it’s a human story where every detail matters.

 

Interested in learning more?

Micro-Decisions, Macro Impact: Cultivating an Agile Mindset in Every Line of Statistical Code

Statistical programming is a cornerstone of clinical research, converting raw data into the standard datasets, tables, listings, and figures (TLFs) that support decision-making, regulatory submissions, and publications.

Traditional workflows often limit collaboration, adaptability, and early input from programmers. As timelines shrink and expectations grow, it’s clear that a new way of thinking is needed: one that goes beyond efficiency into adaptability, collaboration, and value creation.

In clinical statistical programming, agility isn’t only about sprints or ceremonies; it starts with the smallest choices we make at the keyboard.

 

Every day, statistical programmers make hundreds of tiny decisions, such as:

  • How to name a variable
  • How to design a macro
  • How to structure a dataset

Most of these choices happen quietly, almost on autopilot. Yet together, they define:

  • How flexible our studies are
  • How easily we can adapt to change
  • How smoothly teams can collaborate

 

These small choices (micro-decisions), multiplied across teams and studies, drive what I call a macro impact.

 

Agility at the code level

Agile thinking refers to building programs with change in mind, favoring adaptability over perfection, and prioritizing clarity and consistency over clever shortcuts. These ideas might sound subtle, but together, they create the difference between rigid code and resilient code.

Programmers can apply agile thinking directly at the code level, through clarity, simplicity, adaptability, and value orientation.

Habits like intentional naming, smart commenting, modular macros, and built-in quality checks make code more resilient and teams more responsive to change.

 

Agility at the code level shows up in many subtle but powerful ways:

  • Intentional naming makes programs self-explanatory and audit ready.
  • Smart commenting tells the why, not just the how.
  • Scalable macros turn adaptability into a default setting.
  • Readable structures make collaboration effortless.
  • Built-in quality checks turn QC from a final gate into a shared rhythm.

 

When practiced consistently, these habits turn teams into systems that learn, adapt, and deliver faster with accuracy and compliance.
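To make these habits concrete, here is a small illustrative sketch in Python: intentional naming, a comment that explains the “why,” and a built-in quality check, all in one small reusable unit. The record layout is an assumption; ADaM-style names like AVAL, ABLFL, and CHG are used only as familiar shorthand:

```python
def change_from_baseline(records, baseline_flag="Y"):
    """Derive change from baseline (CHG) for one subject's records.

    Why: the derivation rule lives in one named, reusable function, so
    reviewers QC it once instead of re-reading it in every program.
    Illustrative sketch: keys echo common ADaM naming (AVAL, ABLFL),
    but the record layout here is an assumption.
    """
    baselines = [r["AVAL"] for r in records if r.get("ABLFL") == baseline_flag]
    if len(baselines) != 1:  # built-in quality check, not a final-gate QC
        raise ValueError(f"expected exactly one baseline record, got {len(baselines)}")
    base = baselines[0]
    return [dict(r, CHG=r["AVAL"] - base) for r in records]
```

The same habits carry over directly to SAS or R: a macro or function whose name states the clinical rule, a header comment that records the rationale, and an assertion that fails loudly when an assumption breaks.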

 

Thinking differently

This isn’t about doing more; it’s about thinking differently while doing what we already do.

Proven models like Kaizen and the Toyota Lean philosophy demonstrate that continuous improvement, a culture of cooperation, and the elimination of waste can deliver maximum value to customers while preserving existing processes and what we have already learned. Through the lens of these philosophies, we see how small enhancements in daily programming can scale into major gains in collaboration, reuse, and efficiency.

 

Final takeaways

Code is communication. Every variable, macro, and comment is a message to a collaborator, a regulator, or your future self.

Let every line you write carry clarity. Let every structure you build invite change. Let every decision reflect agility. That’s the path from micro-decisions to macro impact.

 

Interested in learning more?

Eswara Gunisetti will be at PHUSE EU Connect 2025 to present “Micro-Decisions, Macro Impact: The Role of Agile Thinking in Every Line of Code.” Discover how every line of code can contribute to a more adaptive, transparent, and rewarding way of working, where agility lives not just in our processes, but in our programming decisions themselves.

Register below to book a meeting or visit Booth 9 to connect with our experts:

Career Perspectives: A Conversation with Naydene Slabbert

In this edition of our Career Perspectives series, we are delighted to feature Naydene Slabbert, Principal Clinical Data Manager at Cytel. Naydene shares insights from her career journey, discusses the critical role of early-stage clinical trial setup in ensuring the delivery of high-quality, actionable data, and reflects on the evolving role of data managers in clinical trials.

 

Can you give us a little background on your career so far? What led you to clinical data management, and how has your path evolved over the years?

My journey in clinical data management started over 23 years ago, and it’s been such a rewarding experience filled with growth, learning, and a lot of exciting challenges. I began my career at Quintiles (now IQVIA), where I started as an Assistant Data Coordinator and eventually became a Data Team Lead. Those early years gave me a solid foundation in clinical trial operations and sparked my interest in data quality and process improvement.

In 2021, I moved to DF/Net Research, where I led several high-profile studies and contributed to infrastructure and software development. That role helped me expand my technical and strategic skills, especially in managing complex, multi-site trials.

Now, I’m proud to be part of Cytel as a Principal Clinical Data Manager. My focus is on enhancing end-to-end data management processes, working closely with cross-functional teams, and making sure our systems support both scientific excellence and regulatory success. Over the years, my role has evolved from hands-on data work to strategic leadership, and I continue to be inspired by the impact that well-managed clinical data can have on public health and patient outcomes.

 

You’ve been supporting the lead on a major study that went live in September. What did your day-to-day work look like at this stage of the project?

During the go-live phase of the study I’m working on, my daily focus was to make sure our data management systems and processes were running smoothly and in sync across teams. It’s a crucial time where accuracy, quick thinking, and strong teamwork really matter.

I partnered closely with the study lead and various cross-functional teams to validate the Electronic Data Capture (EDC) system, double-checking that all edit checks and Case Report Forms (CRFs) were working as expected. We held daily huddles and status meetings to keep everyone aligned and moving forward, which made it easier to spot and tackle any issues early on.

This stage demanded a lot of agility, collaboration, and attention to detail — all with the goal of setting the study up for long-term success.

 

From preparing documents to getting the database ready for data collection — how do these early tasks set the foundation for a successful study?

The early stages of a clinical study really lay the groundwork for everything that follows. It’s where we take the scientific goals outlined in the protocol and turn them into practical, workable data processes. Getting this part right is key to the trial’s overall success.

A big part of this involves preparing core documents like the Data Management Plan, validation guidelines, and Standard Operating Procedures (SOPs). These aren’t just paperwork — they’re the playbook that keeps everyone aligned on exactly how data will be collected, reviewed, and reported. They help ensure consistency, compliance, and quality from start to finish.

At the same time, building and testing the database, from CRF design to edit checks and system integrations, is just as critical. This is where we make sure the tools for capturing data are user-friendly, accurate, and fully aligned with the protocol. A well-designed database helps reduce errors, speeds up query resolution, and supports faster decision-making.

By putting in time and care upfront, we’re able to minimize potential risks, boost efficiency, and set the stage for a study that’s not only regulatory-ready but also delivers high-quality, actionable data. In my experience, a strong launch phase really sets the tone for everything that follows.

 

Now the study has gone live, you’re overseeing the data. What does that oversight involve, and how do you ensure data quality and consistency throughout the trial?

Once a study goes live, my role shifts into a proactive oversight phase where the focus is on maintaining data integrity, consistency, and compliance across all participating sites and systems.

Ultimately, my goal is to create a system of continuous quality assurance. By fostering strong communication, leveraging technology for real-time insights, and maintaining rigorous documentation, I help ensure that the data collected is accurate, timely, and meaningful. This supports both scientific outcomes and regulatory success, and ultimately, the patients.

 

What do you like best about your role, and about working at Cytel?

What I enjoy most about my role is the opportunity to lead complex studies that have real-world impact, while collaborating with talented teams across disciplines. I thrive on problem-solving and ensuring data quality from start to finish, and I appreciate the autonomy and trust I’m given to manage projects effectively.

As for Cytel, I value the supportive culture and global collaboration. The company encourages continuous learning and innovation, and I’ve found the environment to be both respectful and intellectually stimulating. It’s rewarding to be part of an organization that’s committed to advancing clinical research through data-driven solutions.

 

Is there a particular project or initiative you’ve worked on recently that you’re especially proud of?

One project I’m especially proud of is the trial I mentioned earlier, which went live recently. It’s a high-profile study with complex data requirements, and I’ve been deeply involved from the early planning stages through to go-live. I helped translate the protocol into robust data collection tools, oversaw database setup and testing, and now manage ongoing data oversight. What makes this project stand out is the level of collaboration and precision required. It’s been incredibly rewarding to see our preparation pay off in a smooth launch!

 

You’ve held leadership roles across several organizations. What’s one piece of career advice you wish you had received earlier?

If I could go back and give myself one piece of advice early in my career, it would be: “Don’t shy away from getting your hands dirty.” I used to think leadership was mostly about strategy and oversight, but some of the most valuable lessons, and the biggest impacts made, came from jumping into the details.

Whether it’s troubleshooting a tricky data issue, reviewing CRFs, or helping build out a database, being hands-on keeps you sharp and connected to the work. It also builds trust with your team. They can see you’re not just directing from the sidelines, but genuinely in it with them. That kind of involvement helps you lead with more empathy, insight, and credibility.

 

How has your approach to managing clinical data changed over time, especially as you’ve moved into more strategic roles?

Over time, my approach to managing clinical data has shifted from task execution to strategic oversight. Early in my career, I focused on operational details such as CRF design, data cleaning, and query resolution. As I moved into leadership roles, I began shaping data strategies, aligning them with protocol goals, regulatory requirements, and sponsor expectations. I now prioritize proactive planning, cross-functional collaboration, and system optimization to ensure data quality and efficiency across the entire study lifecycle.

Now, my approach is focused on seeing the bigger picture and guiding teams toward smarter, scalable solutions.

 

Clinical trials can be complex, especially when managing data across different regions and systems. What are some of the biggest challenges you’ve faced in data management, and how did you tackle them?

One of the biggest challenges in clinical data management is keeping data consistent and reliable across multiple regions, especially in large, global studies. Each site often has its own workflows, varying levels of experience, and different infrastructure, which can lead to inconsistencies in how data is captured and handled.

To manage this, I focus on creating clear, well-structured documentation and providing centralized training to ensure everyone is on the same page. I also put strong validation processes in place to catch issues early. Working closely with vendors and site teams is key — it allows us to resolve problems in real time and keep the data aligned across systems.

Strategic planning and open communication play a big role too. By staying connected with all stakeholders and anticipating potential challenges, we’re able to maintain high-quality, harmonized data throughout the trial. It’s all about building trust, being proactive, and keeping the bigger picture in mind.

 

The field is evolving quickly. How do you see the role of data managers changing with the rise of AI, machine learning, and decentralized trials?

The role of data managers is indeed evolving rapidly with the rise of AI, machine learning, and decentralized trials. We’re moving from purely operational roles to more strategic ones, where we not only manage data but also help shape how it’s collected, interpreted, and used.

One trend I’m particularly excited about is the integration of AI and machine learning into data cleaning and query management. These tools help us move from reactive to proactive data oversight, identifying patterns and anomalies much earlier in the process. Decentralized trials are also reshaping how we collect and manage data — requiring more flexible systems and real-time validation strategies. As a data manager, I now focus more on system integration, data governance, and ensuring that new technologies align with regulatory standards and study goals.

These innovations are pushing us to become more strategic, tech-savvy, and collaborative, which I find both challenging and energizing. It’s an exciting shift that requires both adaptability and a strong foundation in data quality principles.

 

What skills do you think will be essential for future data managers entering the field?

I think future professionals in this space will need a mix of technical know-how, strategic thinking, and flexibility to really thrive.

For starters, being comfortable with data, understanding how to interpret it, analyze it, and use tools that support automation and predictive insights, is going to be key. With AI, machine learning, and real-time data becoming more common, data managers will need to be confident working with more complex systems and datasets.

Technical skills will always be important. You’ll still need to work with EDC platforms, understand coding and data standards, and know how to manage data integrations. But we’re also seeing a growing need to understand APIs, interoperability, and data governance, especially as decentralized trials become more widespread.

Just as important are the soft skills. Strong communication, collaboration, and leadership are essential because data managers often act as the link between clinical, statistical, and operational teams. Being able to bring people together and keep everyone aligned makes a huge difference.

And finally, I’d say curiosity and a willingness to keep learning are vital. The field is changing fast, and those who stay open to new ideas and keep building their skills will be best positioned to lead the way.

 

As a remote employee, how do you maintain a healthy work-life balance? What strategies work for you, and do you feel supported by Cytel in this regard?

Working remotely definitely has its perks, but maintaining a healthy work-life balance takes a bit of intention. For me, it starts with having a clear plan for the day. I like to set goals, block out time for focused work, and make sure I take regular breaks. I also try to stick to a consistent “log-off” time, which helps me mentally switch from work mode to personal time.

One thing that’s really helped is having a dedicated workspace that’s separate from my living space. It makes it easier to stay focused during the day and disconnect in the evenings. I also make time for walks, family, and activities that help me recharge as those are just as important as meetings and deadlines.

Cytel has been incredibly supportive when it comes to flexibility and balance. There’s a lot of trust and autonomy, and the culture really respects personal time. Leadership encourages us to take care of ourselves, which makes remote work not only manageable but genuinely enjoyable.

 

You have been with Cytel for around 6 months now. What aspects of Cytel’s culture stood out to you when you joined?

What really stood out for me when I joined Cytel was how collaborative and welcoming the culture is. From day one, I felt like part of a team. People are generous with their time, open to new ideas, and genuinely invested in working together to achieve shared goals. It’s not just about getting the job done; it’s about how we support each other along the way.

I also really appreciate the company’s focus on quality and innovation. There’s a strong drive for continuous improvement, and strategic thinking is encouraged. That’s something I value deeply in my own work, especially when it comes to refining processes and contributing to cross-functional initiatives.

Another thing that impressed me is how well remote employees are supported. Even though I’m based in South Africa, I’ve felt fully connected to the global team. Communication is seamless, and there’s a real effort to make sure remote staff feel included and empowered.

Overall, Cytel fosters a culture that supports both professional growth and personal well-being, and that’s something I truly appreciate.

 

Finally, what are your main interests outside of work? What helps you recharge and stay inspired?

When I’m not working, you’ll probably find me out in the beautiful South African bushveld, book in hand, or enjoying coffee in the sun — my personal reset button. I love getting creative in the kitchen (even if some dinners end up as “learning experiences”) and tackling home improvement projects just for the fun of it.

I’m also a mom to teenagers, which means my life is a mix of deep chats, dramatic eye rolls, and trying to keep up with slang that changes weekly. They keep me laughing, grounded, and constantly on my toes.

Spending time with family and friends is what really recharges me. It’s the fuel that keeps everything else running smoothly.

Thank you, Naydene, for sharing your experience with us!

Naydene Slabbert

Moving to Agile: A New Approach to Statistical Programming

Traditional software development has long been characterized by rigid methods that required teams to follow pre-defined processes. The advent of Agile revolutionized this model by shifting the focus to flexibility, collaboration, and continuous improvement. Unlike traditional methods, Agile embraces change and enables teams to respond quickly to new requirements.

Now, the Agile approach has moved from software development into statistical programming, allowing teams to work in small increments rather than following a linear, pre-planned process. Instead of extensive upfront planning, Agile encourages adaptability and frequent reassessment of project goals.

Here, I discuss Agile methodologies, the benefits and challenges, and invite readers to learn more with our new case study on implementing Agile and Scrum for SAS programming in clinical development.

 

What is Agile programming?

Agile is an iterative project management and development approach that prioritizes flexibility, collaboration, and responsiveness to change. Though originally developed for software engineering, Agile has since gained widespread adoption across various industries, including healthcare and clinical research.

At the heart of Agile is the concept of breaking down complex projects into smaller, manageable units of work, called “sprints,” typically lasting one to four weeks. At the end of each sprint, the team delivers a functional product increment, ensuring continuous feedback and the ability to adjust course as needed.

Key tenets of Agile in statistical programming include:

  • Prioritizing individuals and interactions over processes and tools to foster teamwork and effective communication.
  • Prioritizing customer collaboration over contract negotiation to involve stakeholders throughout the process.
  • Prioritizing responding to change over following a plan to support remaining flexible to evolving needs.

These tenets support incremental delivery of outputs, frequent feedback loops among programmers, and overall team collaboration.

 

Benefits of Agile programming

Agile methodologies offer numerous additional advantages, making them a preferred choice for modern development teams:

 

Faster delivery times

Agile focuses on small, manageable iterations (sprints), allowing teams to release interim deliverables frequently rather than waiting for the entire product to be complete.

 

Higher customer satisfaction

Continuous delivery and ongoing stakeholder involvement ensure products align with user needs, leading to better adoption and positive feedback.

 

Reduced risk of project failure

By regularly assessing project goals, teams can detect potential issues early and make adjustments before they become costly problems.

 

Agile methodologies

Agile methodologies come in different flavors, each tailored to unique team dynamics and project needs.

 

Scrum

Scrum is one of the most widely used Agile frameworks. It divides development into short cycles called sprints (typically 2 weeks), during which teams work on prioritized tasks. Scrum incorporates daily stand-up meetings and reviews to track progress and remove obstacles.

 

Kanban

Kanban is a visual workflow management system that emphasizes continuous delivery. Teams use a Kanban board to track tasks in various stages (To-Do, In Progress, Completed), ensuring transparency and limiting work in progress to prevent bottlenecks.
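The Kanban mechanics described above can be illustrated with a minimal sketch (the `KanbanBoard` class and its column names are hypothetical, for illustration only, not taken from any specific tool): tasks flow through the three stages, and the board refuses to start new work once the work-in-progress limit is reached, which is how Kanban prevents bottlenecks.

```python
class KanbanBoard:
    """Illustrative Kanban board with a work-in-progress (WIP) limit."""

    def __init__(self, wip_limit: int):
        self.wip_limit = wip_limit
        self.columns = {"To-Do": [], "In Progress": [], "Completed": []}

    def add_task(self, task: str) -> None:
        self.columns["To-Do"].append(task)

    def start(self, task: str) -> bool:
        """Move a task into 'In Progress' only if under the WIP limit."""
        if len(self.columns["In Progress"]) >= self.wip_limit:
            return False  # bottleneck prevention: refuse to exceed the limit
        self.columns["To-Do"].remove(task)
        self.columns["In Progress"].append(task)
        return True

    def complete(self, task: str) -> None:
        self.columns["In Progress"].remove(task)
        self.columns["Completed"].append(task)
```

In practice the WIP limit is the key design choice: it forces the team to finish work already in flight before pulling in new tasks, keeping the flow visible and steady.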

 

XP

XP focuses on high-quality development practices like test-driven development (TDD) and continuous integration (CI). It encourages pair programming and frequent code reviews to enhance software quality.
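Test-driven development, mentioned above, simply means the test is written before the code it exercises. A minimal Python illustration (the function `mean_change` and its test are hypothetical examples, not from any real project):

```python
# TDD flow: the test is written first and fails until the
# function below it is implemented to satisfy the assertion.

def test_mean_change():
    assert mean_change([2.0, 4.0, 6.0]) == 2.0  # written before the code

def mean_change(values: list[float]) -> float:
    """Average step-to-step change across an ordered series of values."""
    deltas = [b - a for a, b in zip(values, values[1:])]
    return sum(deltas) / len(deltas)
```

The discipline is the cycle itself: write a failing test, write just enough code to pass it, then refactor with the test as a safety net.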

 

Challenges to adopting Agile

While Agile offers many benefits, teams may face challenges when adopting Agile practices. Rapid development cycles can lead to frequent scope changes, making it hard to maintain focus. This can be avoided by clearly defining priorities and using backlog refinement sessions to keep scope manageable.

Additionally, Agile relies heavily on collaboration, but without proper communication, misunderstandings can arise. Strategies for preventing this include encouraging daily stand-ups, using standard project management tools, and fostering a culture in which open commentary is encouraged.

Finally, transitioning to Agile can be difficult, especially in organizations accustomed to traditional methods. But a gradual approach to this new methodology is warranted: provide Agile training, start with pilot projects, and celebrate early wins to build confidence.

 

Final takeaways

Agile programming is more than just a methodology — it’s a mindset that promotes adaptability, efficiency, and collaboration. By embracing Agile, teams can deliver high-quality software faster while continuously improving their processes. Whether you’re a startup or an enterprise, adopting Agile can lead to better productivity and customer satisfaction.

 

Interested in learning more?

Download our new white paper that provides a detailed case study on implementing Agile and Scrum for SAS programming in clinical development.

Offshoring Biometrics FSP Teams: Best Practices

Functional Service Provider (FSP) models are widely used to deliver biometrics services in the biopharmaceutical industry. Traditionally, these teams have been based in the United States and Western Europe, but with a globally recognized talent pool and the need to deliver more value within confined budgets, sponsors are now interested in offshore locations, such as India, South Africa, and Eastern Europe.

Here, I detail best practices for sponsors looking to incorporate offshore FSP teams.

 

Best practices for building offshore FSP teams

Best practices for building offshore teams from scratch include:

 

1. Developing a detailed recruitment plan

Creating a comprehensive recruitment plan that outlines timelines and mutually agreed-upon milestones is key to effectively launching offshore teams. The recruitment plan should be viewed as a living document that is reviewed regularly and updated as needed. The focus should be on "planned vs actual" metrics and ensuring that all roadblocks to recruitment are removed in a timely manner. Finally, this document must be grounded in hard data: where the talent pool is located and the vendor's track record in recruiting from that population.

 

2. Focusing on early risk identification and mitigation

Obstacles to recruitment will occur, and anticipating and planning for these challenges early on will do much to support recruitment success. Common risks to recruitment include lengthy country-specific notice periods, changing economic conditions, and competition from other vendors and sponsors. The FSP vendor should have active plans to implement mitigation strategies to minimize any impacts due to these risks.

 

3. Identifying quality resources that fit the sponsor’s culture

Detailed and complete job descriptions are central to recruitment success, but beyond the pure technical skills, FSP recruitment must incorporate an assessment of overall “fit” within the sponsor’s organization. For example, does the role require working as part of a team or is an individual performer more likely to find success? All of this should be supported by a dedicated global talent acquisition team that understands where to find talent to increase the probability of recruitment success.

 

4. Accelerating onboarding

Strong onboarding is highly correlated with employee retention; it must be timely, practical, and clear. Ideally, new FSP hires should start one week prior to their first day with the sponsor, allowing time to complete internal training at the FSP provider and to learn the sponsor's expectations from other team members or the FSP Lead before starting. Finally, pairing new hires with an established "buddy" from whom they can seek day-to-day advice on the role contributes greatly to new employee satisfaction.

 

5. Prioritizing retention

Tenets of effective retention planning start with a positive and seamless onboarding experience and progress to garnering employee feedback and establishing a continuous feedback loop. Other strategies include employee recognition and rewards and offering creative professional development opportunities. Additionally, while salary and bonus are indeed important to employees, these should be supplemented with other valued benefits, such as flexible work hours, to demonstrate that employees are valued.

 

Considerations for sponsors

Many sponsors with already established onshore FSP teams are interested in offshoring options, essentially replacing these resources with resources in more cost-effective countries. In these cases, business continuity is the utmost priority and transition timings must work around the needs of the business and required portfolio deliverables. This requires a fair amount of upfront planning with the sponsor, based on the following questions:

  • Assuming that timeline slippage is not permissible and that all key deliverables are of equal priority, what are the key deliverables due this calendar year, mapped out by month?
  • Which FSP personnel to be transitioned are involved in these deliverables?
  • Of the FSP personnel assigned to each key deliverable, which are most critical (to be transitioned later), and which are less critical (to be transitioned earlier)?
  • For replacement headcount, what geographies are preferred (if any)?
  • What are the notice periods for these preferred geographies?
  • Finally, how do we reconcile the time required to transition off with the time required to transition on, while minimizing any work process disruptions?

This is an iterative process that requires close collaboration with the sponsor.

Winning in a Budget-Constrained World: Smarter Clinical Trial Optimization

Clinical trials have become more complex and costly in recent years, driven by expanding data requirements, global regulatory demands, and increasingly specialized therapies. For sponsors and CROs, balancing quality with cost efficiency is more challenging than ever, especially when trying to streamline biometric data management across diverse geographies.

However, several proven strategies are helping organizations optimize clinical trial budgets without sacrificing quality or compliance. From flexible resourcing models to cutting-edge technology, industry leaders are rethinking traditional approaches and adopting scalable solutions to meet today’s demands.

 

Current trends in clinical trial cost optimization

One of the key trends in FSP biometrics is the move toward more flexible, modular engagements that allow sponsors to optimize costs while maintaining access to specialized expertise. Rather than relying on large, fixed teams, organizations are increasingly leveraging scalable FSP models to allocate resources dynamically across data management, statistical programming, and biostatistics functions based on project phases and workload intensity. This flexibility is especially valuable during high-demand periods like database lock or interim analyses, where rapid scaling is needed without long-term overhead. Additionally, sponsors are integrating global delivery models within FSP partnerships, tapping into talent pools from cost-effective regions while ensuring alignment with quality standards. The growing use of technology-driven efficiencies, such as automated data checks and AI-supported programming workflows within FSP teams, is further driving down costs and improving operational agility.

 

Building specialized skill sets to improve efficiency and quality

As clinical trials grow more specialized, access to niche expertise has become a critical factor in maintaining quality and managing costs. Building internal capability through focused training programs allows data management, biostatistics, and statistical programming teams to stay current with the latest methodologies and regulatory requirements.

Skilled teams reduce rework, prevent costly errors, and improve turnaround times — all of which contribute directly to budget optimization. In addition, companies that invest in continuous learning foster a culture of quality and innovation, setting themselves apart in a highly competitive market.

 

Global Capability Centers (GCCs): Unlocking scalability and cost savings

Global Capability Centers (GCCs) have emerged as a strategic asset for clinical trial sponsors. Located in cost-effective regions but equipped with world-class talent and infrastructure, GCCs allow organizations to scale their operations efficiently while maintaining control over quality and timelines.

By leveraging GCCs for biometric functions — including data management, programming, and biostatistics — companies can optimize labor costs without sacrificing expertise. Additionally, operating in multiple time zones supports 24/7 workflows, helping to accelerate study timelines and manage global studies more effectively.

 

Innovative resourcing models: FSP and just-in-time staffing

Traditional full-service outsourcing models are being supplemented — and sometimes replaced — by more agile FSP arrangements. With FSP models, sponsors retain greater control over trial oversight while benefiting from specialized services and flexible resource deployment.

Just-in-time staffing is another innovative approach that is gaining traction. This model enables organizations to quickly onboard qualified professionals only when their expertise is needed, reducing idle time and controlling personnel costs. Both models help sponsors stay nimble in response to shifting trial demands while protecting budgets.

 

Emerging markets and government incentives

Many emerging markets are becoming attractive hubs for clinical trial operations thanks to favorable government policies, tax incentives, and infrastructure investments. Countries across Asia, Eastern Europe, and Latin America are building sophisticated clinical research ecosystems that offer significant cost advantages.

By expanding into these regions, sponsors gain access to large, diverse patient populations and skilled professionals, creating opportunities for faster enrollment and cost-efficient trial execution.

 

Integrating AI and technology to streamline processes

Artificial Intelligence (AI) and machine learning are revolutionizing how clinical trial data is managed. From automating data cleaning to predictive analytics that identify risks earlier, AI-driven tools help reduce manual effort, minimize errors, and speed up decision-making.

Smart technology adoption also enhances resource allocation, allowing biometric teams to focus on high-value tasks while repetitive work is automated. This balance leads to better quality data, faster insights, and meaningful cost reductions across the trial lifecycle.

 

Shaping the future of cost-efficient clinical trials

Rising clinical trial costs are a reality — but they don’t have to derail your development pipeline. By embracing scalable solutions, investing in talent, and exploring emerging technologies, organizations can navigate today’s challenges while safeguarding both budgets and data quality.

Agile leadership, a global mindset, and a willingness to innovate will be key to succeeding in this new landscape. Whether it’s tapping into global capability centers, leveraging just-in-time staffing, or integrating AI tools, the path to more efficient and effective clinical trials is within reach.

Staying informed and adaptable ensures your clinical development strategies remain competitive, cost-effective, and ready for the future of healthcare innovation.

 

Interested in learning more? Watch our on-demand webinar, “Winning in a Budget-Constrained World: Smarter Clinical Trial Optimization”: