External forcings file

The external forcing .ext file contains the forcing data for a D-Flow FM model. This includes open boundaries, lateral discharges and meteorological forcings. The documentation below only concerns the 'old' format (ExtForceFile in the MDU file).
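Each forcing in the old-format file is written as a block of keyword=value lines, using the uppercase aliases documented below (QUANTITY, FILENAME, FILETYPE, METHOD, OPERAND). A minimal, illustrative sketch of one such block for a water-level boundary, assuming a hypothetical polyline file left_boundary.pli:

```
QUANTITY=waterlevelbnd
FILENAME=left_boundary.pli
FILETYPE=9
METHOD=3
OPERAND=O
```

Here FILETYPE=9 denotes a polyline file, METHOD=3 interpolation in time and space with saved weights, and OPERAND=O overwriting of existing values.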

Model

Enum class containing the valid values for the Spatial parameter category of the external forcings.

For more details, see the D-Flow FM User Manual 1D2D, Chapter D.3.1, Table D.2: https://content.oss.deltares.nl/delft3d/D-Flow_FM_User_Manual_1D2D.pdf

ExtOldForcing

Bases: BaseModel

Class holding the external forcing values.

Source code in hydrolib/core/dflowfm/extold/models.py
class ExtOldForcing(BaseModel):
    """Class holding the external forcing values."""

    quantity: Union[ExtOldQuantity, str] = Field(alias="QUANTITY")
    """Union[Quantity, str]: The name of the quantity."""

    filename: Union[PolyFile, TimModel, DiskOnlyFileModel] = Field(
        None, alias="FILENAME"
    )
    """Union[PolyFile, TimModel, DiskOnlyFileModel]: The file associated to this forcing."""

    varname: Optional[str] = Field(None, alias="VARNAME")
    """Optional[str]: The variable name used in `filename` associated with this forcing; some input files may contain multiple variables."""

    sourcemask: DiskOnlyFileModel = Field(
        default_factory=lambda: DiskOnlyFileModel(None), alias="SOURCEMASK"
    )
    """DiskOnlyFileModel: The file containing a mask."""

    filetype: ExtOldFileType = Field(alias="FILETYPE")
    """FileType: Indication of the file type.

    Options:
    1. Time series
    2. Time series magnitude and direction
    3. Spatially varying weather
    4. ArcInfo
    5. Spiderweb data (cyclones)
    6. Curvilinear data
    7. Samples (C.3)
    8. Triangulation magnitude and direction
    9. Polyline (<*.pli>-file, C.2)
    11. NetCDF grid data (e.g. meteo fields)
    14. NetCDF wave data
    """

    method: ExtOldMethod = Field(alias="METHOD")
    """ExtOldMethod: The method of interpolation.

    Options:
    1. Pass through (no interpolation)
    2. Interpolate time and space
    3. Interpolate time and space, save weights
    4. Interpolate space
    5. Interpolate time
    6. Averaging space
    7. Interpolate/Extrapolate time
    """

    extrapolation_method: Optional[ExtOldExtrapolationMethod] = Field(
        None, alias="EXTRAPOLATION_METHOD"
    )
    """Optional[ExtOldExtrapolationMethod]: The extrapolation method.

    Options:
    0. No spatial extrapolation.
    1. Do spatial extrapolation outside of source data bounding box.
    """

    maxsearchradius: Optional[float] = Field(None, alias="MAXSEARCHRADIUS")
    """Optional[float]: Search radius (in m) for model grid points that lie outside of the source data bounding box."""

    operand: Operand = Field(alias="OPERAND")
    """Operand: The operand to use for adding the provided values.

    Options:
    'O' Existing values are overwritten with the provided values.
    'A' Provided values are used where existing values are missing.
    '+' Existing values are summed with the provided values.
    '*' Existing values are multiplied with the provided values.
    'X' The maximum values of the existing values and provided values are used.
    'N' The minimum values of the existing values and provided values are used.
    """

    value: Optional[float] = Field(None, alias="VALUE")
    """Optional[float]: Custom coefficients for transformation."""

    factor: Optional[float] = Field(None, alias="FACTOR")
    """Optional[float]: The conversion factor."""

    ifrctyp: Optional[float] = Field(None, alias="IFRCTYP")
    """Optional[float]: The friction type."""

    averagingtype: Optional[float] = Field(None, alias="AVERAGINGTYPE")
    """Optional[float]: The averaging type."""

    relativesearchcellsize: Optional[float] = Field(
        None, alias="RELATIVESEARCHCELLSIZE"
    )
    """Optional[float]: The relative search cell size for samples inside a cell."""

    extrapoltol: Optional[float] = Field(None, alias="EXTRAPOLTOL")
    """Optional[float]: The extrapolation tolerance."""

    percentileminmax: Optional[float] = Field(None, alias="PERCENTILEMINMAX")
    """Optional[float]: Changes the min/max operator to an average of the highest/lowest data points. The value sets the percentage of the total set that is to be included.."""

    area: Optional[float] = Field(None, alias="AREA")
    """Optional[float]: The area for sources and sinks."""

    nummin: Optional[int] = Field(None, alias="NUMMIN")
    """Optional[int]: The minimum required number of source data points in each target cell."""

    tracerfallvelocity: Optional[float] = Field(None, alias="TRACERFALLVELOCITY")
    tracerdecaytime: Optional[float] = Field(None, alias="TRACERDECAYTIME")

    def is_intermediate_link(self) -> bool:
        return True

    @root_validator(pre=True)
    def handle_case_insensitive_tracer_fields(cls, values):
        """Handle case-insensitive matching for tracer fields."""
        values_copy = dict(values)

        # Define the field names and their aliases
        tracer_fields = ["tracerfallvelocity", "tracerdecaytime"]

        for field_i in values.keys():
            if field_i.lower() in tracer_fields:
                # if the field is already lowercase no need to change it
                if field_i != field_i.lower():
                    # If the key is not in the expected lowercase format, add it with the correct format
                    values_copy[field_i.lower()] = values_copy[field_i]
                    # Remove the original key to avoid "extra fields not permitted" error
                    values_copy.pop(field_i)

        return values_copy

    @validator("quantity", pre=True)
    def validate_quantity(cls, value):
        if isinstance(value, ExtOldQuantity):
            return value

        def raise_error_tracer_name(quantity: ExtOldTracerQuantity):
            raise ValueError(
                f"QUANTITY '{quantity}' should be appended with a tracer name."
            )

        if isinstance(value, ExtOldTracerQuantity):
            raise_error_tracer_name(value)

        value_str = str(value)
        lower_value = value_str.lower()

        for tracer_quantity in ExtOldTracerQuantity:
            if lower_value.startswith(tracer_quantity):
                n = len(tracer_quantity)
                if n == len(value_str):
                    raise_error_tracer_name(tracer_quantity)
                return tracer_quantity + value_str[n:]

        if lower_value in list(ExtOldQuantity):
            return ExtOldQuantity(lower_value)

        supported_value_str = ", ".join(([x.value for x in ExtOldQuantity]))
        raise ValueError(
            f"QUANTITY '{value_str}' not supported. Supported values: {supported_value_str}"
        )

    @validator("operand", pre=True)
    def validate_operand(cls, value):
        if isinstance(value, Operand):
            return value

        if isinstance(value, str):

            for operand in Operand:
                if value.lower() == operand.value.lower():
                    return operand

            supported_value_str = ", ".join(([x.value for x in Operand]))
            raise ValueError(
                f"OPERAND '{value}' not supported. Supported values: {supported_value_str}"
            )

        return value

    @root_validator(skip_on_failure=True)
    def validate_forcing(cls, values):
        class _Field:
            def __init__(self, key: str) -> None:
                self.alias = cls.__fields__[key].alias
                self.value = values[key]

        def raise_error_only_allowed_when(
            field: _Field, dependency: _Field, valid_dependency_value: str
        ):
            error = f"{field.alias} only allowed when {dependency.alias} is {valid_dependency_value}"
            raise ValueError(error)

        def only_allowed_when(
            field: _Field, dependency: _Field, valid_dependency_value: Any
        ):
            """This function checks if a particular field is allowed to have a value only when a dependency field has a specific value."""

            if field.value is None or dependency.value == valid_dependency_value:
                return

            raise_error_only_allowed_when(field, dependency, valid_dependency_value)

        quantity = _Field("quantity")
        varname = _Field("varname")
        sourcemask = _Field("sourcemask")
        filetype = _Field("filetype")
        method = _Field("method")
        extrapolation_method = _Field("extrapolation_method")
        maxsearchradius = _Field("maxsearchradius")
        value = _Field("value")
        factor = _Field("factor")
        ifrctype = _Field("ifrctyp")
        averagingtype = _Field("averagingtype")
        relativesearchcellsize = _Field("relativesearchcellsize")
        extrapoltol = _Field("extrapoltol")
        percentileminmax = _Field("percentileminmax")
        area = _Field("area")
        nummin = _Field("nummin")

        only_allowed_when(varname, filetype, ExtOldFileType.NetCDFGridData)

        if sourcemask.value.filepath is not None and filetype.value not in [
            ExtOldFileType.ArcInfo,
            ExtOldFileType.CurvilinearData,
        ]:
            raise_error_only_allowed_when(
                sourcemask, filetype, valid_dependency_value="4 or 6"
            )

        if (
            extrapolation_method.value
            == ExtOldExtrapolationMethod.SpatialExtrapolationOutsideOfSourceDataBoundingBox
            and method.value != ExtOldMethod.InterpolateTimeAndSpaceSaveWeights
            and method.value != ExtOldMethod.Obsolete
        ):
            error = f"{extrapolation_method.alias} only allowed to be 1 when {method.alias} is 3"
            raise ValueError(error)

        only_allowed_when(
            maxsearchradius,
            extrapolation_method,
            ExtOldExtrapolationMethod.SpatialExtrapolationOutsideOfSourceDataBoundingBox,
        )
        only_allowed_when(value, method, ExtOldMethod.InterpolateSpace)

        if factor.value is not None and not quantity.value.startswith(
            ExtOldTracerQuantity.InitialTracer
        ):
            error = f"{factor.alias} only allowed when {quantity.alias} starts with {ExtOldTracerQuantity.InitialTracer}"
            raise ValueError(error)

        only_allowed_when(ifrctype, quantity, ExtOldQuantity.FrictionCoefficient)
        only_allowed_when(averagingtype, method, ExtOldMethod.AveragingSpace)
        only_allowed_when(relativesearchcellsize, method, ExtOldMethod.AveragingSpace)
        only_allowed_when(extrapoltol, method, ExtOldMethod.InterpolateTime)
        only_allowed_when(percentileminmax, method, ExtOldMethod.AveragingSpace)
        only_allowed_when(
            area, quantity, ExtOldQuantity.DischargeSalinityTemperatureSorSin
        )
        only_allowed_when(nummin, method, ExtOldMethod.AveragingSpace)

        return values

    @root_validator(pre=True)
    def choose_file_model(cls, values):
        """Root-level validator to the right class for the filename parameter based on the filetype.

        The validator chooses the right class for the filename parameter based on the FileType_FileModel_mapping
        dictionary.

        FileType_FileModel_mapping = {
            1: TimModel,
            2: TimModel,
            3: DiskOnlyFileModel,
            4: DiskOnlyFileModel,
            5: DiskOnlyFileModel,
            6: DiskOnlyFileModel,
            7: DiskOnlyFileModel,
            8: DiskOnlyFileModel,
            9: PolyFile,
            10: PolyFile,
            11: DiskOnlyFileModel,
            12: DiskOnlyFileModel,
        }
        """
        # if the filetype and the filename are present in the values
        if any(par in values for par in ["filetype", "FILETYPE"]) and any(
            par in values for par in ["filename", "FILENAME"]
        ):
            file_type_var_name = "filetype" if "filetype" in values else "FILETYPE"
            filename_var_name = "filename" if "filename" in values else "FILENAME"
            file_type = values.get(file_type_var_name)
            raw_path = values.get(filename_var_name)
            model = FILETYPE_FILEMODEL_MAPPING.get(int(file_type))

            if not isinstance(raw_path, model):
                raw_path = model(raw_path)

            values[filename_var_name] = raw_path

        return values
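The tracer handling inside validate_quantity can be sketched standalone. The snippet below is a simplified, hypothetical stand-in: TracerQuantity mimics two members of the real ExtOldTracerQuantity enum, and normalize_quantity reproduces the prefix-matching logic, which requires a user-defined tracer name after the tracer prefix and preserves that name's casing while canonicalizing the prefix.

```python
from enum import Enum

# Hypothetical stand-in for ExtOldTracerQuantity (the real enum lives in
# hydrolib.core.dflowfm.extold.models); member values are illustrative.
class TracerQuantity(str, Enum):
    TracerBnd = "tracerbnd"
    InitialTracer = "initialtracer"

def normalize_quantity(value: str) -> str:
    """Mimic validate_quantity's tracer handling: a tracer quantity must be
    followed by a tracer name, which is kept as-is after the canonical prefix."""
    lower = value.lower()
    for tq in TracerQuantity:
        if lower.startswith(tq.value):
            n = len(tq.value)
            if n == len(value):
                # Bare tracer quantity without a tracer name is rejected.
                raise ValueError(
                    f"QUANTITY '{tq.value}' should be appended with a tracer name."
                )
            # Canonical lowercase prefix + the original (case-preserved) suffix.
            return tq.value + value[n:]
    # Non-tracer quantities are simply lowercased here.
    return lower
```

For example, normalize_quantity("TracerBndSalt") yields "tracerbndSalt", while a bare "TRACERBND" raises a ValueError, matching the raise_error_tracer_name path above.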

area = Field(None, alias='AREA') class-attribute instance-attribute

Optional[float]: The area for sources and sinks.

averagingtype = Field(None, alias='AVERAGINGTYPE') class-attribute instance-attribute

Optional[float]: The averaging type.

extrapolation_method = Field(None, alias='EXTRAPOLATION_METHOD') class-attribute instance-attribute

Optional[ExtOldExtrapolationMethod]: The extrapolation method.

Options:
0. No spatial extrapolation.
1. Do spatial extrapolation outside of source data bounding box.

extrapoltol = Field(None, alias='EXTRAPOLTOL') class-attribute instance-attribute

Optional[float]: The extrapolation tolerance.

factor = Field(None, alias='FACTOR') class-attribute instance-attribute

Optional[float]: The conversion factor.

filename = Field(None, alias='FILENAME') class-attribute instance-attribute

Union[PolyFile, TimModel, DiskOnlyFileModel]: The file associated with this forcing.

filetype = Field(alias='FILETYPE') class-attribute instance-attribute

ExtOldFileType: Indication of the file type.

Options:
1. Time series
2. Time series magnitude and direction
3. Spatially varying weather
4. ArcInfo
5. Spiderweb data (cyclones)
6. Curvilinear data
7. Samples (C.3)
8. Triangulation magnitude and direction
9. Polyline (<*.pli>-file, C.2)
11. NetCDF grid data (e.g. meteo fields)
14. NetCDF wave data

ifrctyp = Field(None, alias='IFRCTYP') class-attribute instance-attribute

Optional[float]: The friction type.

maxsearchradius = Field(None, alias='MAXSEARCHRADIUS') class-attribute instance-attribute

Optional[float]: Search radius (in m) for model grid points that lie outside of the source data bounding box.

method = Field(alias='METHOD') class-attribute instance-attribute

ExtOldMethod: The method of interpolation.

Options:
1. Pass through (no interpolation)
2. Interpolate time and space
3. Interpolate time and space, save weights
4. Interpolate space
5. Interpolate time
6. Averaging space
7. Interpolate/Extrapolate time

nummin = Field(None, alias='NUMMIN') class-attribute instance-attribute

Optional[int]: The minimum required number of source data points in each target cell.

operand = Field(alias='OPERAND') class-attribute instance-attribute

Operand: The operand to use for adding the provided values.

Options:
'O' Existing values are overwritten with the provided values.
'A' Provided values are used where existing values are missing.
'+' Existing values are summed with the provided values.
'*' Existing values are multiplied with the provided values.
'X' The maximum values of the existing values and provided values are used.
'N' The minimum values of the existing values and provided values are used.

percentileminmax = Field(None, alias='PERCENTILEMINMAX') class-attribute instance-attribute

Optional[float]: Changes the min/max operator to an average of the highest/lowest data points. The value sets the percentage of the total set that is to be included.

quantity = Field(alias='QUANTITY') class-attribute instance-attribute

Union[ExtOldQuantity, str]: The name of the quantity.

relativesearchcellsize = Field(None, alias='RELATIVESEARCHCELLSIZE') class-attribute instance-attribute

Optional[float]: The relative search cell size for samples inside a cell.

sourcemask = Field(default_factory=lambda: DiskOnlyFileModel(None), alias='SOURCEMASK') class-attribute instance-attribute

DiskOnlyFileModel: The file containing a mask.

value = Field(None, alias='VALUE') class-attribute instance-attribute

Optional[float]: Custom coefficients for transformation.

varname = Field(None, alias='VARNAME') class-attribute instance-attribute

Optional[str]: The variable name used in filename associated with this forcing; some input files may contain multiple variables.

choose_file_model(values)

Root-level validator that chooses the right class for the filename parameter based on the filetype.

The validator chooses the right class for the filename parameter based on the FileType_FileModel_mapping dictionary.

FileType_FileModel_mapping = {
    1: TimModel,
    2: TimModel,
    3: DiskOnlyFileModel,
    4: DiskOnlyFileModel,
    5: DiskOnlyFileModel,
    6: DiskOnlyFileModel,
    7: DiskOnlyFileModel,
    8: DiskOnlyFileModel,
    9: PolyFile,
    10: PolyFile,
    11: DiskOnlyFileModel,
    12: DiskOnlyFileModel,
}

Source code in hydrolib/core/dflowfm/extold/models.py
@root_validator(pre=True)
def choose_file_model(cls, values):
    """Root-level validator to the right class for the filename parameter based on the filetype.

    The validator chooses the right class for the filename parameter based on the FileType_FileModel_mapping
    dictionary.

    FileType_FileModel_mapping = {
        1: TimModel,
        2: TimModel,
        3: DiskOnlyFileModel,
        4: DiskOnlyFileModel,
        5: DiskOnlyFileModel,
        6: DiskOnlyFileModel,
        7: DiskOnlyFileModel,
        8: DiskOnlyFileModel,
        9: PolyFile,
        10: PolyFile,
        11: DiskOnlyFileModel,
        12: DiskOnlyFileModel,
    }
    """
    # if the filetype and the filename are present in the values
    if any(par in values for par in ["filetype", "FILETYPE"]) and any(
        par in values for par in ["filename", "FILENAME"]
    ):
        file_type_var_name = "filetype" if "filetype" in values else "FILETYPE"
        filename_var_name = "filename" if "filename" in values else "FILENAME"
        file_type = values.get(file_type_var_name)
        raw_path = values.get(filename_var_name)
        model = FILETYPE_FILEMODEL_MAPPING.get(int(file_type))

        if not isinstance(raw_path, model):
            raw_path = model(raw_path)

        values[filename_var_name] = raw_path

    return values
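The FILETYPE dispatch can be sketched standalone. The placeholder classes below stand in for hydrolib's TimModel, PolyFile, and DiskOnlyFileModel, and the mapping mirrors the FileType_FileModel_mapping shown in the docstring; the key handling accepts either lowercase or uppercase keys, as in the validator.

```python
# Placeholder file-model classes standing in for hydrolib's TimModel,
# PolyFile, and DiskOnlyFileModel; each simply wraps a raw path.
class TimModel:
    def __init__(self, path):
        self.path = path

class PolyFile:
    def __init__(self, path):
        self.path = path

class DiskOnlyFileModel:
    def __init__(self, path):
        self.path = path

# Mirrors the FileType_FileModel_mapping from the docstring above.
FILETYPE_FILEMODEL_MAPPING = {
    1: TimModel, 2: TimModel,
    3: DiskOnlyFileModel, 4: DiskOnlyFileModel, 5: DiskOnlyFileModel,
    6: DiskOnlyFileModel, 7: DiskOnlyFileModel, 8: DiskOnlyFileModel,
    9: PolyFile, 10: PolyFile,
    11: DiskOnlyFileModel, 12: DiskOnlyFileModel,
}

def choose_file_model(values: dict) -> dict:
    """Wrap the raw filename in the file model matching the filetype."""
    ft_key = "filetype" if "filetype" in values else "FILETYPE"
    fn_key = "filename" if "filename" in values else "FILENAME"
    # Only dispatch when both the filetype and the filename are present.
    if ft_key in values and fn_key in values:
        model = FILETYPE_FILEMODEL_MAPPING[int(values[ft_key])]
        raw = values[fn_key]
        if not isinstance(raw, model):
            values[fn_key] = model(raw)
    return values
```

For example, choose_file_model({"FILETYPE": 9, "FILENAME": "bnd.pli"}) wraps the path in a PolyFile, while values without both keys pass through unchanged.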

handle_case_insensitive_tracer_fields(values)

Handle case-insensitive matching for tracer fields.

Source code in hydrolib/core/dflowfm/extold/models.py
@root_validator(pre=True)
def handle_case_insensitive_tracer_fields(cls, values):
    """Handle case-insensitive matching for tracer fields."""
    values_copy = dict(values)

    # Define the field names and their aliases
    tracer_fields = ["tracerfallvelocity", "tracerdecaytime"]

    for field_i in values.keys():
        if field_i.lower() in tracer_fields:
            # if the field is already lowercase no need to change it
            if field_i != field_i.lower():
                # If the key is not in the expected lowercase format, add it with the correct format
                values_copy[field_i.lower()] = values_copy[field_i]
                # Remove the original key to avoid "extra fields not permitted" error
                values_copy.pop(field_i)

    return values_copy
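The key normalization above can be sketched as a small standalone function; this is a simplified reproduction of the validator's behavior, not the hydrolib implementation itself.

```python
def normalize_tracer_keys(values: dict) -> dict:
    """Lowercase any key that case-insensitively matches a tracer field,
    mirroring handle_case_insensitive_tracer_fields; other keys are untouched."""
    tracer_fields = {"tracerfallvelocity", "tracerdecaytime"}
    out = dict(values)
    for key in values:
        if key.lower() in tracer_fields and key != key.lower():
            # Re-insert under the lowercase name and drop the original key,
            # avoiding an "extra fields not permitted" error downstream.
            out[key.lower()] = out.pop(key)
    return out
```

For example, normalize_tracer_keys({"TRACERFALLVELOCITY": 0.001}) returns {"tracerfallvelocity": 0.001}.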

ExtOldModel

Bases: ParsableFileModel

The overall external forcings model that contains the contents of one external forcings file (old format).

This model is typically referenced under a FMModel.external_forcing.extforcefile.

Source code in hydrolib/core/dflowfm/extold/models.py
class ExtOldModel(ParsableFileModel):
    """
    The overall external forcings model that contains the contents of one external forcings file (old format).

    This model is typically referenced under a [FMModel][hydrolib.core.dflowfm.mdu.models.FMModel]`.external_forcing.extforcefile`.
    """

    comment: List[str] = Field(default=HEADER.splitlines()[1:])
    """List[str]: The comments in the header of the external forcing file."""
    forcing: List[ExtOldForcing] = Field(default_factory=list)
    """List[ExtOldForcing]: The external forcing/QUANTITY blocks in the external forcing file."""

    @classmethod
    def _ext(cls) -> str:
        return ".ext"

    @classmethod
    def _filename(cls) -> str:
        return "externalforcings"

    def dict(self, *args, **kwargs):
        return dict(comment=self.comment, forcing=[dict(f) for f in self.forcing])

    @classmethod
    def _get_serializer(
        cls,
    ) -> Callable[[Path, Dict, SerializerConfig, ModelSaveSettings], None]:
        return Serializer.serialize

    @classmethod
    def _get_parser(cls) -> Callable[[Path], Dict]:
        return Parser.parse

    @property
    def quantities(self) -> List[str]:
        """List all the quantities in the external forcings file."""
        return [forcing.quantity for forcing in self.forcing]

comment = Field(default=HEADER.splitlines()[1:]) class-attribute instance-attribute

List[str]: The comments in the header of the external forcing file.

forcing = Field(default_factory=list) class-attribute instance-attribute

List[ExtOldForcing]: The external forcing/QUANTITY blocks in the external forcing file.

quantities property

List all the quantities in the external forcings file.