Refactoring Monolithic Razor Pages: Adopting a Pipeline Pattern for Enhanced Maintainability
Refactoring monolithic code in Razor Pages can transform tangled logic into modular, testable components. This post explores refactoring stats computations from a bloated page handler to a dedicated service using a pipeline pattern, boosting separation of concerns, testability, and readability.
Identifying the Monolithic Problem
Monolithic code crams everything into one method, like OnGetAsync in a Razor Page. Symptoms include:
- Long methods with data loading, calculations, and saving.
- Mixed concerns: querying, processing, stats computation.
- Hard to test: Dependencies on DbContext make isolation tough.
- Poor readability: Dense logic obscures intent.
Consider the old UpdateStats.cshtml.cs: its OnGetAsync fetches products, imports, inventories, and line items; loops over reporting terms; computes per-product stats with inline formulas for the mean, median, skewness, and more; handles errors; and saves the results to the database.
A handler like that violates the Single Responsibility Principle and inflates maintenance costs.
The Pipeline Pattern Solution
The pipeline pattern chains processors, each handling exactly one concern. Define an IStatCalculator interface with a Calculate method, then implement one calculator per stat (e.g., BasicStatsCalculator, SkewnessCalculator, KurtosisCalculator).
ProductStatsService orchestrates the IStatCalculator pipeline: its ComputeStats method iterates over terms and products, builds a StatsContext for each, and runs every calculator against the ProductStat being assembled.
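A minimal sketch of that contract, assuming a StatsContext that carries the shared data and a ProductStat with settable numeric properties (the Values property and both shapes are illustrative):

using System.Linq;

// The pipeline contract: one calculator per concern.
public interface IStatCalculator
{
    void Calculate(StatsContext context, ProductStat stat);
}

// Example stage: min/max/mean. Assumes context.Values holds the
// data points for the current product/term (an illustrative property).
public class BasicStatsCalculator : IStatCalculator
{
    public void Calculate(StatsContext context, ProductStat stat)
    {
        var values = context.Values;
        if (values == null || values.Count == 0) return;

        stat.Min = values.Min();
        stat.Max = values.Max();
        stat.Mean = values.Average();
    }
}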
Step-by-Step Refactoring
1. Extract a service: Create ProductStatsService.cs with a ComputeStats method and move the stats computation (and per-term filtering) out of the handler into it.
2. Define the pipeline: Add IStatCalculator implementations for stats such as the basics (min/max/mean), the median, and skewness.
3. Refactor the handler: In the new UpdateStats.cshtml.cs, OnGetAsync queues a background task that loads the minimal data, calls service.ComputeStats, and saves the resulting stats. (The old version is here: UpdateStats.cshtml.cs)
4. Introduce a context object: Use StatsContext to pass shared data (products, imports, etc.) through the pipeline to each calculator; a sketch follows this list.
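A minimal StatsContext sketch, assuming the entities named above; the Values property is an illustrative addition holding the per-product data points:

using System.Collections.Generic;

// Shared, read-mostly data handed to every calculator in the pipeline.
public class StatsContext
{
    public int TermDays { get; init; }                    // 0 = all time
    public Product Product { get; init; }
    public IReadOnlyList<ProductInventoryImport> Imports { get; init; }
    public IReadOnlyList<decimal> Values { get; init; }   // illustrative: data points for this product/term
}

Because every calculator receives the same context type, adding a new input later means extending the context, not changing every Calculate signature.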
Code Examples
Before (Monolithic in Page Handler):
Complete listing: UpdateStats.cshtml.cs
public async Task<IActionResult> OnGetAsync()
{
    // Clear existing stats
    await dbContext.Database.ExecuteSqlRawAsync("DELETE FROM ProductStats");

    // Loop over reporting terms (0 = all time, otherwise the last N days)
    foreach (int t in new int[] { 0, 90, 30 })
    {
        // Fetch and filter imports for the current term
        var imports = await dbContext.ProductInventoryImports
            .Where(i => t == 0 || i.ImportDateTime >= DateTime.UtcNow.AddDays(-t))
            .ToListAsync();

        // Compute totals and units, then stats like mean, median, skewness inline
        // ... (hundreds of lines of calculations per product)
    }

    await dbContext.SaveChangesAsync();
    return Page();
}
After (Service with Pipeline):
Complete listing: ProductStatsService.cs
// In ProductStatsService.cs
public IEnumerable<ProductStat> ComputeStats(/* params */)
{
    var statsList = new List<ProductStat>();
    foreach (var term in terms)
    {
        // Filter the shared data down to the current term
        var filteredImports = /* ... */;

        foreach (var product in products)
        {
            // Build the context shared by every calculator
            var context = new StatsContext { /* ... */ };
            var stat = new ProductStat { /* ... */ };

            // Run the pipeline: each calculator fills in its own stat
            foreach (var calculator in _pipeline)
            {
                calculator.Calculate(context, stat);
            }

            statsList.Add(stat);
        }
    }
    return statsList;
}
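The _pipeline field is simply an ordered list of calculators. A sketch of how the service might compose it, assuming the calculator classes named earlier (MedianCalculator is an illustrative name):

using System.Collections.Generic;

public class ProductStatsService
{
    private readonly IReadOnlyList<IStatCalculator> _pipeline = new List<IStatCalculator>
    {
        // Order matters: later stages may rely on values computed by
        // earlier ones (e.g., skewness and kurtosis need the mean).
        new BasicStatsCalculator(),
        new MedianCalculator(),
        new SkewnessCalculator(),
        new KurtosisCalculator(),
    };

    // ComputeStats as shown above.
}

Adding a stat is now a one-line change here plus a new calculator class; nothing else moves.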
Page Handler Simplified:
Complete listing: UpdateStats.cshtml.cs
public async Task<IActionResult> OnGetAsync()
{
    _queue.QueueBackgroundWorkItem(async token =>
    {
        // Load the minimal data the service needs
        var products = /* ... */;
        var imports = /* ... */;
        // ...

        var statsService = new ProductStatsService();
        var stats = statsService.ComputeStats(/* ... */);

        // Save the computed stats
    });

    return RedirectToPage(/* ... */);
}
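The _queue field is assumed to follow the background-task-queue pattern from the ASP.NET Core documentation (it is not shown in the listing); its contract is roughly:

using System;
using System.Threading;
using System.Threading.Tasks;

// Minimal background-queue contract, as in the ASP.NET Core docs pattern.
public interface IBackgroundTaskQueue
{
    void QueueBackgroundWorkItem(Func<CancellationToken, Task> workItem);
    Task<Func<CancellationToken, Task>> DequeueAsync(CancellationToken cancellationToken);
}

A hosted service dequeues and runs the work items, so the long computation never blocks the request.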
Benefits
- Separation of Concerns: Each calculator focuses on one stat; service orchestrates the pipeline.
- Testability: Mock or hand-build a StatsContext for calculator unit tests, isolated from the database (see the sketch after this list).
- Readability: Shorter methods; logic modularized; easier to follow pipeline flow.
- Extensibility: Add/remove calculators without touching core logic.
- Performance: Potential for parallelization or optimization per step.
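With the database out of the picture, a calculator test is a few lines. A sketch against the illustrative BasicStatsCalculator from earlier, assuming xUnit and a ProductStat with settable Min/Max/Mean properties:

using System.Collections.Generic;
using Xunit;

public class BasicStatsCalculatorTests
{
    [Fact]
    public void Calculate_SetsMinMaxAndMean()
    {
        // Arrange: no DbContext, no mocking framework needed
        var context = new StatsContext { Values = new List<decimal> { 1m, 2m, 3m } };
        var stat = new ProductStat();

        // Act: run the single pipeline stage under test
        new BasicStatsCalculator().Calculate(context, stat);

        // Assert
        Assert.Equal(1m, stat.Min);
        Assert.Equal(3m, stat.Max);
        Assert.Equal(2m, stat.Mean);
    }
}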
Shifting complex computations into a pipeline-based service declutters Razor Pages and aligns with established best practices: fewer bugs, easier maintenance, and a design that scales as new stats are added. Apply it wherever a page handler has grown into a tangle of calculations.