WRF (v4.4) vs. MPAS-A (v7.3) vs. CPAS (v1.0)

Obtaining/defining the regional high-resolution grid
- WRF: Follow WRF's best practices to define nested domains.
- MPAS-A: Download available standard meshes from the official site.
- CPAS: Specify your regions and corresponding target resolutions via the cloud-based service platform.

Shape of region
- WRF: Rectangular.
- MPAS-A: Circular or elliptical (in all available meshes).
- CPAS: Arbitrarily shaped.

Resolution variability
- WRF: x3 resolution refinement in nesting. Example: 27 km - 9 km - 3 km - 1 km.
- MPAS-A: x4 to x20 resolution refinement in practice. Example: 60 km - 3 km mesh.
- CPAS: Arbitrary, and can be extremely large thanks to hierarchical time-stepping; refinement as extreme as x500 is possible. Example: 100 km - 200 m.

Automatic resolution boost for orography
- WRF: nil. MPAS-A: nil. CPAS: Supported.

Automatic resolution boost for coastline
- WRF: nil. MPAS-A: nil. CPAS: Supported.

Population of geographic data
- WRF: Supports different resolutions, as specified in the WPS namelist file. Example: topo_10m, topo_5m, topo_2m, topo_gmted2010_30s.
- MPAS-A: Supports a single resolution, disregarding cell-size variation. Example: topo_30s, topo_gmted2010_30s.
- CPAS: Supports multiple resolutions to match cell sizes; users can let CPAS choose the resolution automatically or specify it manually. Example: topo_10m, topo_5m, topo_2m, topo_30s, topo_15s, topo_3s.

Analyzing simulation error at the resolution jump / transition
- WRF: N/A. MPAS-A: N/A. CPAS: A shallow-water wave test and error plots can be generated by the platform (experimental).
WRF (v4.4) vs. MPAS-A (v7.3) vs. CPAS (v1.0)

Model type
- WRF: Limited-area model, targeting high-resolution simulation within the domain.
- MPAS-A: Global model with moderate regional resolution refinement.
- CPAS: Global model with arbitrary resolution refinement, including extreme refinement for a small region.

Forecast accuracy and time horizon
- WRF: Very accurate within the high-resolution region, up to about 3 days.
- MPAS-A: Generally accurate for a medium-range time horizon, up to about 9 days.
- CPAS: Very accurate within the high-resolution region and generally accurate for a medium-range time horizon, up to about 9 days with an appropriately designed mesh.

Time stepping
- WRF: Each domain uses its own time step size, specified in the WRF namelist.
- MPAS-A: All cells on the globe use the same time step, disregarding cell-size variation. (This time step is limited by the smallest cell in the entire mesh, as required by the stability condition for time integration; a single small cell therefore makes the computational resource requirement prohibitively large.)
- CPAS: Mesh cells of different sizes use different time steps. (This not only reduces the required computational resources, but also enables the use of extreme variable resolution on a global mesh.)

Hierarchical time-stepping (HTS) report
- WRF: N/A. MPAS-A: N/A. CPAS: Available after mesh generation, showing the regions at each time-stepping level and their resource utilization.

Parallelization and load balancing
- WRF: The computation of each domain is parallelized, but nested domains execute sequentially.
- MPAS-A: The mesh of the whole globe is fully partitioned; parallelization is excellent.
- CPAS: The mesh of the whole globe is fully partitioned, with time stepping taken into account.

Scalability
- WRF: Limited by the domain with the smallest number of grid cells.
- MPAS-A: Extremely scalable.
- CPAS: Very scalable.

Scale-awareness of physics model
- WRF: Different domains may use different physics models.
- MPAS-A: No scale-awareness in the convection scheme, except the Grell-Freitas scheme.
- CPAS: Automatically switches off cumulus parameterization for small cells.

Handling wave reflection at the resolution jump / transition
- WRF: Handled by the specified and relaxation zones near the lateral boundaries at the resolution jump.
- MPAS-A: Handled by filtering, with default / specified coefficients for horizontal diffusion, over the gradual resolution transition.
- CPAS: The mesh-generation algorithm creates a customized mesh with a resolution transition smooth enough to be handled by filtering with default horizontal-diffusion coefficients.

Model inconsistency between the regional model and the global driving model
- WRF: Different parameterizations in the physics models, especially those related to moisture, between WRF and the driving global model produce artifacts and spurious effects that propagate from the lateral boundaries into the region of interest.
- MPAS-A: No such problem.
- CPAS: No such problem.

Four-dimensional data assimilation
- WRF: Supported. MPAS-A: nil. CPAS: Supported, with a scaled FDDA option for variable-resolution meshes.
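The time-stepping contrast above can be made concrete with a toy CFL calculation. This is an illustrative sketch, not CPAS or MPAS-A code; the wave speed, Courant number and cell sizes are assumed example values.

```python
# Why a single global time step is dominated by the smallest cell:
# an explicit scheme is stable only if dt <= C * dx / c (the CFL condition).

def cfl_time_step(dx_m, wave_speed=300.0, courant=1.0):
    """Largest stable explicit time step (seconds) for a cell of size dx_m."""
    return courant * dx_m / wave_speed

cell_sizes = [100_000.0, 25_000.0, 3_000.0, 200.0]  # metres: 100 km down to 200 m

# Uniform stepping (MPAS-A style): every cell advances with the step
# dictated by the smallest cell in the entire mesh.
dt_uniform = min(cfl_time_step(dx) for dx in cell_sizes)

# The 100 km cells are then forced to take far more steps than they need:
dt_coarse = cfl_time_step(cell_sizes[0])
print(dt_uniform)                # ~0.67 s, set by the 200 m cell
print(dt_coarse / dt_uniform)    # 500.0: wasted work on coarse cells
```

Hierarchical time-stepping avoids exactly this waste by letting each cell size advance with a step close to its own CFL limit.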
CPAS is the first cloud-based service for generating customized MPAS-A meshes. Our service requires no software installation and no scripting or programming to generate an MPAS-A mesh. You simply specify the mesh in a web browser, and we help you execute, optimize and maintain it.
CPAS supports "domains" (hereinafter called regions) of arbitrary shape, arbitrary size and arbitrary resolution. In WRF domain nesting, a map projection must be chosen beforehand, all domains are restricted to rectangles, the resolution change across domains is restricted to a x3 or x5 jump, and the number of cells in each domain must be tuned manually for efficient parallel execution. In CPAS, for your desired forecast horizon (e.g. 120 hours), you simply determine the transition zone and the resolution of the outer region.
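A smooth transition between the inner high-resolution region and the coarse outer region can be pictured as a cell-width function of distance from the region of interest. The sketch below uses a tanh blend with made-up inner/outer widths and transition parameters; it is an illustrative assumption, not the CPAS mesh-generation algorithm.

```python
import math

def cell_width_km(dist_km, inner_km=1.0, outer_km=60.0,
                  transition_center_km=500.0, transition_half_width_km=100.0):
    """Target cell width as a smooth function of distance from the region of interest."""
    # blend goes from ~0 (inside the region) to ~1 (far outside)
    blend = 0.5 * (1.0 + math.tanh((dist_km - transition_center_km)
                                   / transition_half_width_km))
    return inner_km + (outer_km - inner_km) * blend

print(cell_width_km(0.0))      # close to 1 km inside the region
print(cell_width_km(2000.0))   # close to 60 km far outside
```

A gradual width function like this is what allows wave reflection at the resolution transition to be controlled with default horizontal-diffusion filtering.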
The Shin-Hong PBL parameterization scheme has been ported from WRF and is available on the CPAS platform. For "grey zone" resolutions of roughly 1 km down to 200 m, the Shin-Hong scheme tackles the grid-spacing dependency of turbulence parameterization.
The program for preparing static and initial conditions has been customized to use geographical data at multiple resolutions, in order to handle extreme resolution variation. A new 3-arc-second dataset with terrain, land use and soil has been incorporated to handle grid cells as fine as a few hundred meters.
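Matching dataset resolution to cell size can be sketched as a simple lookup: sample each cell from the coarsest dataset that still provides a few data points per cell. The dataset spacings and the "points per cell" rule below are assumptions for illustration, not CPAS's actual selection logic.

```python
# (dataset name, grid spacing in arc-seconds); 1 arc-second is ~30 m at the equator
DATASETS = [
    ("topo_10m", 600), ("topo_5m", 300), ("topo_2m", 120),
    ("topo_30s", 30), ("topo_15s", 15), ("topo_3s", 3),
]

def pick_dataset(cell_size_m, metres_per_arcsec=30.0, points_per_cell=3):
    """Choose the coarsest dataset still giving ~points_per_cell samples per cell."""
    target_arcsec = cell_size_m / metres_per_arcsec / points_per_cell
    for name, spacing in DATASETS:        # listed coarse -> fine
        if spacing <= target_arcsec:
            return name
    return DATASETS[-1][0]                # fall back to the finest available

print(pick_dataset(60_000.0))  # coarse cells -> topo_10m
print(pick_dataset(300.0))     # ~300 m cells -> topo_3s
```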
CPAS supports four-dimensional data assimilation (FDDA), a technique that nudges the model simulation towards a reference atmospheric state. If you want to perform a retrospective simulation with CPAS, or create a forecast that does not deviate far from reference forecast data, the FDDA option may come in handy.
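The idea behind nudging is Newtonian relaxation: an extra tendency pulls the model state towards the reference with a chosen time scale. The equation form is standard; the step size, relaxation time and values below are illustrative, not CPAS defaults.

```python
def nudge(x, x_ref, dt, tau):
    """One explicit Euler step of the nudging tendency dx/dt = -(x - x_ref) / tau."""
    return x - dt * (x - x_ref) / tau

x, x_ref = 10.0, 0.0        # model state and reference state (arbitrary units)
dt, tau = 60.0, 3600.0      # 60 s step, 1 h relaxation time scale

for _ in range(60):         # integrate for one hour
    x = nudge(x, x_ref, dt, tau)

print(x)  # after one relaxation time, roughly 10 * exp(-1)
```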
Not only does CPAS support global simulation, it also supports limited-area simulation, for which the boundary can be drawn in an arbitrary shape. You can even specify multiple boundaries on the globe to perform multiple limited-area simulations in a single model run.
Orography and coastline resolution are important for accurate weather forecasts. CPAS provides options to enhance resolution in these areas, automatically increasing resolution over mountains and land-water boundaries.
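A resolution boost of this kind can be sketched as a per-cell rule: refine wherever the terrain is steep or a cell straddles land and water. The thresholds and boost factor below are assumed values for illustration, not CPAS internals.

```python
def boosted_resolution_km(base_km, terrain_slope, land_fraction,
                          slope_threshold=0.05, boost_factor=2.0):
    """Halve the target cell size over steep terrain or coastline cells."""
    is_steep = terrain_slope > slope_threshold
    is_coastal = 0.0 < land_fraction < 1.0   # mixed land/water cell
    if is_steep or is_coastal:
        return base_km / boost_factor
    return base_km

print(boosted_resolution_km(4.0, terrain_slope=0.10, land_fraction=1.0))  # mountain: 2.0
print(boosted_resolution_km(4.0, terrain_slope=0.01, land_fraction=0.4))  # coast: 2.0
print(boosted_resolution_km(4.0, terrain_slope=0.01, land_fraction=1.0))  # plain: 4.0
```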
Partitioning of the mesh is customized to optimize hierarchical time-stepping. In WRF's parallelization algorithm, parallel computing is restricted to within each domain and nested domains are computed sequentially, so the domain with the smallest number of cells limits scalability. MPAS-A parallelizes computation by partitioning the global mesh, giving excellent scalability; CPAS partitions the global mesh in the same way while also accounting for time-stepping levels.
Model physics is also customized to be scale-aware: convection parameterization is disabled for small cells (i.e. at convection-resolving scales).
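Scale-aware physics selection reduces to a per-cell switch. The ~4 km threshold below is a commonly cited convection-permitting scale and is an assumption for this sketch, not a documented CPAS constant.

```python
def use_cumulus_scheme(cell_size_km, convection_permitting_km=4.0):
    """Parameterize convection only on cells too coarse to resolve it explicitly."""
    return cell_size_km >= convection_permitting_km

print(use_cumulus_scheme(60.0))  # True: coarse cell, convection must be parameterized
print(use_cumulus_scheme(1.0))   # False: convection-resolving cell, scheme switched off
```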
In contrast to WRF, CPAS does not require lateral boundary conditions. In dynamical downscaling with WRF, the synoptic-scale flows in the regional model are constrained because its lateral boundary conditions are forced by the global model, and the regional model also suffers from differences between global and regional model physics. CPAS therefore allows the initial state more freedom to evolve, which can greatly improve accuracy. Finally, unlike WRF, CPAS does not require downloading Global Forecast System (GFS) forecast data.
CPAS automatically assigns optimal time steps according to the resolution variation, so you are free to design variable-resolution mesh configurations that are infeasible in the official MPAS-A release. Hierarchical time-stepping lets a variable-resolution global model behave like a nested-domain multi-resolution regional model, concentrating computation in the regions of interest.
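One plausible way to picture hierarchical time-stepping is a level per cell-size doubling, with the base step halved at each level so fine cells take many short steps while coarse cells take few long ones. The level rule and numbers below are illustrative assumptions, not the actual CPAS HTS algorithm.

```python
import math

def hts_level(cell_size_m, coarsest_m=100_000.0):
    """Level = number of step-halvings between this cell and the coarsest cell."""
    return max(0, math.ceil(math.log2(coarsest_m / cell_size_m)))

def steps_per_hour(cell_size_m, dt0_s=400.0):
    """Steps taken in one model hour; the base step dt0 is halved per HTS level."""
    dt = dt0_s / 2 ** hts_level(cell_size_m)
    return round(3600.0 / dt)

# Coarse 100 km cells take few long steps; fine 200 m cells take many short ones,
# so computation is concentrated where the mesh is fine.
for size in (100_000.0, 3_000.0, 200.0):
    print(size, hts_level(size), steps_per_hour(size))
```

A report like the HTS report mentioned above would summarize exactly this: which regions sit at which level, and how much of the total work each level consumes.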
There is no need to install any software: computational experiments run in the cloud, and CPAS provides an easy-to-use visualization system for inspecting the results there as well.