
Slide1

ECMWF "uncoupled" coupled model: Coupling using subroutine calls

CW2015

Kristian S. Mogensen

Marine Prediction Section

k.mogensen@ecmwf.int

Slide2

Overview of talk:

Baseline:

The focus of this talk is going to be on the coupling of IFS and WAM to NEMO

Motivation:

Why do we need to couple the ocean to the atmosphere at ECMWF in the first place?
A brief history of coupled model setups at ECMWF
Technologies used for coupling in the past
Coupling requirements for NWP versus climate modelling

The current coupled model based on IFS-WAM-NEMO:
How does it work, and in particular how do we glue IFS+WAM together with the NEMO model?

Scalability of the coupled model

Conclusions and outlook

Slide3

Why do we need an ocean model for NWP?

The ocean is the lower boundary of the atmosphere over a large part of the Earth.
Accurately modelling this lower boundary should feed back on the atmosphere.

For long-range predictions such as seasonal and monthly forecasting this is very important (e.g. ENSO).

For medium-range forecasting the ocean state (including sea ice) can change on a daily time scale:
Ice modelling
Hurricane feedback to/from the ocean

Today we use a coupled model for the following systems:
The ensemble prediction system (ENS) from day 0 to 15, and the monthly extension to day 32 twice per week
The seasonal forecasting system

Slide4

Hurricane modelling with our coupled model.


Slide5

A short history of coupled modelling at ECMWF

IFS (atmosphere) coupled to HOPE (ocean) from around 1997:
OASIS2, based on coupling via files/pipes
Seasonal forecasting: Systems 1 to 3
Monthly forecasting: originally a separate system; became part of VarEPS (now ENS), coupled in leg B (day 10 to 15) and leg C (beyond day 15), in March 2008

IFS coupled to NEMO (ocean) from November 2011:
OASIS3, based on coupling via MPI using MPMD execution
Seasonal forecasting System 4 + VarEPS/ENS
Still used for seasonal forecasting

IFS coupled to NEMO in a single executable from November 2013 in ENS:
No external coupler
Coupling between the WAM component and NEMO
ENS system coupled in all 3 legs (coupling from day 0)

Slide6

Coupling requirements for NWP versus climate modelling

Prediction from the short range out to seasonal time scales is to a large extent an initial condition problem:
You can't make accurate predictions unless you know your initial state.

We need data assimilation systems for atmosphere, land, ocean, waves, …
Data assimilation systems can depend on the model formulation, e.g. you need tangent linear and adjoint models for a 4D-Var system.
A large amount of human resources goes into developing the DA systems.
Where we use external models (like the NEMO model for the ocean) we stick with one model for many years.

A consequence is that we don't care too much about being very flexible with the choice of models.
A fit-for-purpose solution is more important.

Coupling in all forecasting systems is a priority for ECMWF for the future.

Slide7

Current coupled system:

Design principle: the atmosphere and wave models don’t know anything about the inner workings of NEMO

There is a coupling layer which has access to all NEMO F90 modules (e.g. data), but only accepts data from IFS/WAM as arguments.
Coupling fields to/from NEMO are passed as subroutine arguments to this layer in IFS/WAM.
Grid information from IFS/WAM needs to be passed as arguments as well.

All regridding is done within the interface layer:
Interpolation weights are computed outside the model and read from a file.
The interpolation is done in parallel, with the minimum of source field information communicated to the individual MPI tasks:
If destination points a and b on task N both need source point c from task M, then c is only sent to task N once.

Coupling is sequential with the following calling sequence: Atmosphere → Waves → Ocean.
Time stepping in the atmosphere model (IFS) drives the coupling.
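To make the design principle concrete, here is a minimal, hypothetical sketch (the routine and variable names are illustrative only, not the actual interface): the coupling layer is free to USE NEMO modules internally, while IFS/WAM only ever pass and receive plain arrays.

SUBROUTINE coupling_layer_put( npoints, pfield_ifs )
   ! Sketch only: inside the coupling layer any NEMO F90 module could be
   ! USEd directly (e.g. the ocean state); omitted here to keep the sketch
   ! self-contained.
   IMPLICIT NONE
   ! The IFS/WAM side only ever hands over plain arrays as arguments:
   INTEGER, INTENT(IN) :: npoints                 ! number of local IFS grid points
   REAL(8), INTENT(IN) :: pfield_ifs(npoints)     ! coupling field on the IFS grid
   ! 1) regrid pfield_ifs from the IFS grid to the NEMO grid (in this layer)
   ! 2) copy the regridded field into the relevant NEMO variables
   ! The IFS itself never needs to know anything about NEMO's internals.
END SUBROUTINE coupling_layer_put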

Slide8

Schematic overview of the coupled system:

The LIM2 model needs additional fields from the IFS

Slide9

Coupled from the point of view of the atmosphere:

IF (LNEMOCOUP) CALL ININEMO          ! initialize the NEMO coupling

DO JSTEP=NSTAR2,ISTOP

  <<BORING ATMOSPHERE STUFF>>

  ! Wave model (WAM) coupling
  IF (LLWVTIME) THEN
    CALL WVCOUPLE(TSTEP,NSTPW,LLSTOP,LLWRRW)
  ENDIF

  ! Couple to NEMO every NFRCO atmosphere time steps
  IF (NFRCO /= 0 .AND. LNEMOCOUP .AND. LMCC04) THEN
    IF (MOD(NSTEP,NFRCO) == 0) THEN
      IF (LWCOU) THEN
        CALL UPDNEMOFIELDS(LWSTOKES)
        CALL UPDNEMOSTRESS
      ENDIF
      CALL COUPLNEMO(NSTEP)
    ENDIF
  ENDIF

ENDDO

Slide10

Coupling initialization (called below ININEMO)

In nemogcmcoup_coupinit.F90:

CALL scripremap_read( cdfile_gauss_to_T, remap_gauss_to_T )

CALL parinter_init( mype, npes, icomm,              &
   & npoints,  nglopoints,  nlocmsk, ngloind,       & ! IFS
   & nopoints, noglopoints, omask,   ogloind,       & ! NEMO
   & remap_gauss_to_T, gausstoT, … )

CALL scripremap_dealloc( remap_gauss_to_T )

where:

remap_gauss_to_T : structure containing the interpolation weights
gausstoT : structure containing the information for the parallel interpolation

This is done for all relevant grid combinations (IFS-NEMO, WAM-NEMO).
Interpolation weights are computed outside the coupled model.
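For orientation, the precomputed weights simply encode a sparse linear map from the source grid to the destination grid: each destination point value is a weighted sum over a small set of source points. In the notation of the parinter_fld routine shown on a later slide, the regridding amounts to

$$
z_{\text{dst}}(d) \;=\; \sum_{\substack{i=1,\dots,\,\text{num\_links}\\ \text{dst\_address}(i)=d}} \text{remap\_matrix}(1,i)\;\, z_{\text{src}}\big(\text{src\_address}(i)\big)
$$

for every local destination point d.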

Slide11

Coupled running of NEMO (COUPLNEMO)

How the data is transferred to NEMO, how NEMO is time-stepped, and how fields are received by the IFS. Interface routines are shown in blue.

! Update NEMO forcing fields
CALL NEMOGCMCOUP_UPDATE( MYPROC-1, NPROC, MPL_COMM, NGPTOT, &
   & ZSTRSU, ZSTRSV, ZFRSOS, ZFCHAS, ZFHUMS, &
   & KSTEP, LNEMOFLUXNC )

! NEMO time stepping
DO JSTPNEMO=NEMOCSTEP,NEMOCSTEP+NEMONSTEP-1
   ! Advance the NEMO model 1 time step
   CALL NEMOGCMCOUP_STEP( JSTPNEMO, IDATE, ITIME )
ENDDO

! Update IFS coupling fields
CALL NEMOGCMCOUP_GET( MYPROC-1, NPROC, MPL_COMM, &
   & NGPTOT, ZGSST, ZGICE, ZGUCUR, ZGVCUR )

In NEMOGCMCOUP_UPDATE the arguments are regridded and the relevant variables in NEMO are updated.

The routine NEMOGCMCOUP_STEP basically just calls the standard NEMO time step.

In NEMOGCMCOUP_GET the NEMO fields are regridded onto the argument variables on the atmosphere grid.

Slide12

The actual parallel interpolation/regridding:

Parallel regridding routine (pinfo = gausstoT, gausstoU, …):

SUBROUTINE parinter_fld( mype, nproc, mpi_comm, &
   & pinfo, nsrclocpoints, zsrc, ndstlocpoints, zdst )

   ! Pack the local source points needed by the other tasks
   DO i=1,pinfo%nsendtot
      zsend(i)=zsrc(pinfo%send_address(i))
   ENDDO

   ! Exchange the packed source data between all tasks
   CALL mpi_alltoallv( &
      & zsend, pinfo%nsend(0:nproc-1), &
      & pinfo%nsdisp(0:nproc-1), mpi_double_precision, &
      & zrecv, pinfo%nrecv(0:nproc-1), &
      & pinfo%nrdisp(0:nproc-1), mpi_double_precision, &
      & mpi_comm, istatus )

   ! Apply the precomputed interpolation weights
   zdst(:)=0.0
   DO i=1,pinfo%num_links
      zdst(pinfo%dst_address(i)) = zdst(pinfo%dst_address(i)) + &
         & pinfo%remap_matrix(1,i)*zrecv(pinfo%src_address(i))
   END DO

END SUBROUTINE parinter_fld

Called by the NEMOGCMCOUP_UPDATE and NEMOGCMCOUP_GET routines.
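As a usage illustration of the routine above: inside NEMOGCMCOUP_GET, retrieving a field such as the SST for the atmosphere would conceptually be one such call per field and grid combination. The pinfo structure name (Ttogauss) and the field array names below are assumptions made for illustration; the local point counts (nopoints for NEMO, npoints for IFS) are the ones passed to parinter_init earlier.

! Hedged sketch of a call inside NEMOGCMCOUP_GET; "Ttogauss", "zsst_nemo"
! and "zgsst_ifs" are illustrative names only.
CALL parinter_fld( mype, nproc, mpi_comm, &
   & Ttogauss,            & ! NEMO T grid -> IFS Gaussian grid interpolation info
   & nopoints, zsst_nemo, & ! local NEMO source points and source field
   & npoints,  zgsst_ifs )  ! local IFS destination points and destination field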

Slide13

Scalability of the coupled model

Scalability runs for a 10-day period at:
ENS resolution (T639_L91, or around 31 km): the ocean and the atmosphere are about equal in cost
HRES resolution (T1279_L137, or around 15 km): the ocean is cheap compared to the atmosphere

All runs were done on the production Cray XC-30 systems at ECMWF, which means interference from other running jobs on file system load etc.
Initialization is ignored.
No output, but normal input.
The coupling overhead was investigated by changing the frequency of coupling; this changes the solution, but frequent coupling is a goal.
This was done without the ice model: since we call the ice model at every coupling step, the coupled system would increase in cost due to the increased cost of LIM2.

A production rate of 240 forecast days per day is required for operations (i.e. a 10-day forecast has to complete in about one hour).

Slide14

Scalability of the coupled model at ENS resolutions


Slide15

Effect of coupling frequency: no measurable slowdown.


Slide16

Scalability of the coupled model at HRES resolutions

Slide17

Effect of coupling frequency: no measurable slowdown.

Slide18

Scalability observations

The coupled model scales reasonably well, but there is room for improvement (as always).

In principle we could implement a coupled HRES system today (in terms of run times only).

Very little overhead in the actual coupling on a production system:
More frequent coupling should be more expensive
A dedicated system might reveal some difference

More frequent coupling currently means calling the ice model more often, since it is called at every coupling step:
This makes physical sense, since the forcing fields of the ice model are updated
But it comes at an added cost

Slide19

Challenges for our coupled model: grids.

Top left: Gaussian N128 reduced atmosphere grid
Top right: ORCA1 ocean grid
Bottom left: 1.0 degree reduced wave grid

16 domains for all grids; each colour represents a domain.

Slide20

Challenges for our coupled model: grids (2)

Grids from the previous slide.

A 16-point stencil is used in the interpolation.
Little overlap between the domain decompositions means all interpolations need communication:
Short messages, of the order of kilobytes
Packing all fields together could be done to decrease the number of exchanges (see the sketch below); this is especially important when coupling with the LIM ice model
A solution could be to reshuffle domains in NEMO, but that would require significant changes to the NEMO model
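A hedged sketch of the packing idea mentioned above (an illustration of the possibility, not the actual ECMWF implementation): with nfld fields stored as zsrc(1:nfld,:) and the send/receive buffers given a leading field dimension, the pack-and-exchange part of parinter_fld generalizes to a single mpi_alltoallv for all fields by scaling the counts and displacements.

! Sketch: exchange nfld fields in one mpi_alltoallv instead of nfld exchanges.
! Declarations as in parinter_fld, with zsrc/zsend/zrecv now (nfld,:) arrays.
DO i = 1, pinfo%nsendtot
   zsend(1:nfld,i) = zsrc(1:nfld, pinfo%send_address(i))
ENDDO
CALL mpi_alltoallv( &
   & zsend, nfld*pinfo%nsend(0:nproc-1), &
   & nfld*pinfo%nsdisp(0:nproc-1), mpi_double_precision, &
   & zrecv, nfld*pinfo%nrecv(0:nproc-1), &
   & nfld*pinfo%nrdisp(0:nproc-1), mpi_double_precision, &
   & mpi_comm, istatus )
! The weight-application loop then reads zrecv(1:nfld, pinfo%src_address(i)),
! so the same remap weights are applied to all packed fields in one pass.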

Slide21

Conclusions and outlook

The single-executable coupled model works reasonably well for the current operational resolutions.
As always, improvements are possible.

With our setup the coupled model almost doesn't feel like a coupled model:
There is no fundamental technical difference between calling the atmospheric physics and calling the ocean model (besides the regridding).
The actual "coupling" code is quite small (~2700 lines of code), with about 500 lines of changes to NEMO.

It has been used for ensemble medium-range forecasting (ENS) for close to a year now, with a low resolution (1 degree) ocean configuration.

Work on integrating the coupled model in other systems has been done or is ongoing:
A weakly coupled data assimilation prototype has been developed in the context of reanalysis.

More work on high resolution coupled modelling will be done in the near future.

Slide22

Questions?
