
External forcings file

The external forcing file (.ext) contains the forcing data for a D-Flow FM model, including open boundaries, lateral discharges, and meteorological forcings. The documentation below covers only the 'old' format (ExtForceFile in the MDU file).
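For orientation, a forcing in the old-format file is written as a block of uppercase keyword lines. The sketch below is illustrative only: the quantity, file name, and comment style (old ext files conventionally use `*` for comments) are example values, not taken from a specific model:

```
* Example QUANTITY block for a water-level boundary (illustrative values)
QUANTITY=waterlevelbnd
FILENAME=left_boundary.pli
FILETYPE=9
METHOD=3
OPERAND=O
```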

Model

Enum class containing the valid values for the Spatial parameter category of the external forcings.

For more details check D-Flow FM User Manual 1D2D, Chapter D.3.1, Table D.2 https://content.oss.deltares.nl/delft3d/D-Flow_FM_User_Manual_1D2D.pdf

ExtOldForcing

Bases: BaseModel

Class holding the external forcing values.

This class is used to represent the external forcing values in the D-Flow FM model.

Attributes:

    quantity (Union[ExtOldQuantity, str]):
        The name of the quantity.
    filename (Union[PolyFile, TimModel, DiskOnlyFileModel]):
        The file associated with this forcing.
    varname (Optional[str]):
        The variable name used in filename associated with this forcing; some input files may contain multiple variables.
    sourcemask (DiskOnlyFileModel):
        The file containing a mask.
    filetype (ExtOldFileType):
        Indication of the file type.
        Options:
            1. Time series
            2. Time series magnitude and direction
            3. Spatially varying weather
            4. ArcInfo
            5. Spiderweb data (cyclones)
            6. Curvilinear data
            7. Samples (C.3)
            8. Triangulation magnitude and direction
            9. Polyline (<*.pli>-file, C.2)
            11. NetCDF grid data (e.g. meteo fields)
            14. NetCDF wave data
    method (ExtOldMethod):
        The method of interpolation.
        Options:
            1. Pass through (no interpolation)
            2. Interpolate time and space
            3. Interpolate time and space, save weights
            4. Interpolate space
            5. Interpolate time
            6. Averaging space
            7. Interpolate/Extrapolate time
    extrapolation_method (Optional[ExtOldExtrapolationMethod]):
        The extrapolation method.
        Options:
            0. No spatial extrapolation.
            1. Do spatial extrapolation outside of source data bounding box.
    maxsearchradius (Optional[float]):
        Search radius for model grid points that lie outside of the source data bounding box.
    operand (Operand):
        The operand to use for combining the provided values with existing values.
        Options:
            'O' Existing values are overwritten with the provided values.
            'A' Provided values are used where existing values are missing.
            '+' Existing values are summed with the provided values.
            '*' Existing values are multiplied by the provided values.
            'X' The maximum of the existing and provided values is used.
            'N' The minimum of the existing and provided values is used.
    value (Optional[float]):
        Custom coefficients for transformation.
    factor (Optional[float]):
        The conversion factor.
    ifrctyp (Optional[float]):
        The friction type.
    averagingtype (Optional[float]):
        The averaging type.
    relativesearchcellsize (Optional[float]):
        The relative search cell size for samples inside a cell.
    extrapoltol (Optional[float]):
        The extrapolation tolerance.
    percentileminmax (Optional[float]):
        Changes the min/max operator to an average of the highest/lowest data points. The value sets the percentage of the total set that is to be included.
    area (Optional[float]):
        The area for sources and sinks.
    nummin (Optional[int]):
        The minimum required number of source data points in each target cell.
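The operand codes above can be sketched as a pure-Python combiner. This is an illustration of the documented semantics, not code from hydrolib-core (which applies operands inside the D-Flow FM kernel, not in this class):

```python
def apply_operand(existing, provided, operand):
    """Combine an existing field value with a provided value per the operand code."""
    if operand == "O":   # overwrite existing values
        return provided
    if operand == "A":   # apply only where existing values are missing
        return provided if existing is None else existing
    if operand == "+":   # sum
        return existing + provided
    if operand == "*":   # multiply
        return existing * provided
    if operand == "X":   # maximum
        return max(existing, provided)
    if operand == "N":   # minimum
        return min(existing, provided)
    raise ValueError(f"Unknown operand: {operand!r}")

print(apply_operand(2.0, 3.0, "+"))   # 5.0
print(apply_operand(None, 3.0, "A"))  # 3.0
```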

Source code in hydrolib/core/dflowfm/extold/models.py
class ExtOldForcing(BaseModel):
    """Class holding the external forcing values.

    This class is used to represent the external forcing values in the D-Flow FM model.

    Attributes:
        quantity (Union[ExtOldQuantity, str]):
            The name of the quantity.
        filename (Union[PolyFile, TimModel, DiskOnlyFileModel]):
            The file associated with this forcing.
        varname (Optional[str]):
            The variable name used in `filename` associated with this forcing; some input files may contain multiple variables.
        sourcemask (DiskOnlyFileModel):
            The file containing a mask.
        filetype (ExtOldFileType):
            Indication of the file type.
            Options:
                1. Time series
                2. Time series magnitude and direction
                3. Spatially varying weather
                4. ArcInfo
                5. Spiderweb data (cyclones)
                6. Curvilinear data
                7. Samples (C.3)
                8. Triangulation magnitude and direction
                9. Polyline (<*.pli>-file, C.2)
                11. NetCDF grid data (e.g. meteo fields)
                14. NetCDF wave data
        method (ExtOldMethod):
            The method of interpolation.
            Options:
                1. Pass through (no interpolation)
                2. Interpolate time and space
                3. Interpolate time and space, save weights
                4. Interpolate space
                5. Interpolate time
                6. Averaging space
                7. Interpolate/Extrapolate time
        extrapolation_method (Optional[ExtOldExtrapolationMethod]):
            The extrapolation method.
                Options:
                    0. No spatial extrapolation.
                    1. Do spatial extrapolation outside of source data bounding box.
        maxsearchradius (Optional[float]):
            Search radius for model grid points that lie outside of the source data bounding box.
        operand (Operand):
            The operand to use for combining the provided values with existing values.

            Options:
                'O' Existing values are overwritten with the provided values.
                'A' Provided values are used where existing values are missing.
                '+' Existing values are summed with the provided values.
                '*' Existing values are multiplied with the provided values.
                'X' The maximum values of the existing values and provided values are used.
                'N' The minimum values of the existing values and provided values are used.
        value (Optional[float]):
            Custom coefficients for transformation.
        factor (Optional[float]):
            The conversion factor.
        ifrctyp (Optional[float]):
            The friction type.
        averagingtype (Optional[float]):
            The averaging type.
        relativesearchcellsize (Optional[float]):
            The relative search cell size for samples inside a cell.
        extrapoltol (Optional[float]):
            The extrapolation tolerance.
        percentileminmax (Optional[float]):
            Changes the min/max operator to an average of the highest/lowest data points.
            The value sets the percentage of the total set that is to be included.
        area (Optional[float]):
            The area for sources and sinks.
        nummin (Optional[int]):
            The minimum required number of source data points in each target cell.
    """

    quantity: Union[ExtOldQuantity, str] = Field(alias="QUANTITY")
    filename: Union[PolyFile, TimModel, DiskOnlyFileModel] = Field(
        None, alias="FILENAME"
    )
    varname: Optional[str] = Field(None, alias="VARNAME")
    sourcemask: DiskOnlyFileModel = Field(
        default_factory=lambda: DiskOnlyFileModel(None), alias="SOURCEMASK"
    )
    filetype: ExtOldFileType = Field(alias="FILETYPE")
    method: ExtOldMethod = Field(alias="METHOD")
    extrapolation_method: Optional[ExtOldExtrapolationMethod] = Field(
        None, alias="EXTRAPOLATION_METHOD"
    )

    maxsearchradius: Optional[float] = Field(None, alias="MAXSEARCHRADIUS")
    operand: Operand = Field(alias="OPERAND")
    value: Optional[float] = Field(None, alias="VALUE")
    factor: Optional[float] = Field(None, alias="FACTOR")
    ifrctyp: Optional[float] = Field(None, alias="IFRCTYP")
    averagingtype: Optional[float] = Field(None, alias="AVERAGINGTYPE")

    relativesearchcellsize: Optional[float] = Field(
        None, alias="RELATIVESEARCHCELLSIZE"
    )
    extrapoltol: Optional[float] = Field(None, alias="EXTRAPOLTOL")
    percentileminmax: Optional[float] = Field(None, alias="PERCENTILEMINMAX")
    area: Optional[float] = Field(None, alias="AREA")
    nummin: Optional[int] = Field(None, alias="NUMMIN")

    tracerfallvelocity: Optional[float] = Field(None, alias="TRACERFALLVELOCITY")
    tracerdecaytime: Optional[float] = Field(None, alias="TRACERDECAYTIME")

    def is_intermediate_link(self) -> bool:
        return True

    @model_validator(mode="before")
    @classmethod
    def handle_case_insensitive_tracer_fields(cls, values):
        """Handle case-insensitive matching for tracer fields."""
        values_copy = dict(values)

        # Define the field names and their aliases
        tracer_fields = ["tracerfallvelocity", "tracerdecaytime"]

        for field_i in values.keys():
            if field_i.lower() in tracer_fields:
                # if the field is already lowercase no need to change it
                if field_i != field_i.lower():
                    # If the key is not in the expected lowercase format, add it with the correct format
                    values_copy[field_i.lower()] = values_copy[field_i]
                    # Remove the original key to avoid "extra fields not permitted" error
                    values_copy.pop(field_i)

        return values_copy

    @classmethod
    def validate_quantity_prefix(
        cls, lower_value: str, value_str: str
    ) -> Optional[str]:
        """Checks if the provided quantity string starts with any known valid prefix.

        If the quantity matches a prefix, ensures it is followed by a name.
        Returns the full quantity string if valid, otherwise None.

        Args:
            lower_value (str): The quantity string in lowercase.
            value_str (str): The original quantity string.

        Raises:
            ValueError: If the quantity is only the prefix without a name.
        """
        value = None
        for prefix in ALL_PREFIXES:
            if lower_value.startswith(prefix):
                n = len(prefix)
                if n == len(value_str):
                    raise ValueError(
                        f"QUANTITY '{value_str}' should be appended with a valid name."
                    )
                value = prefix + value_str[n:]
                break

        return value

    @field_validator("quantity", mode="before")
    @classmethod
    def validate_quantity(cls, value) -> Any:
        if not isinstance(value, ExtOldQuantity):
            found = False
            value_str = str(value)
            lower_value = value_str.lower()

            quantity = cls.validate_quantity_prefix(lower_value, value_str)
            if quantity is not None:
                value = quantity
                found = True
            elif lower_value in list(ExtOldQuantity):
                value = ExtOldQuantity(lower_value)
                found = True
            elif value_str in list(ExtOldQuantity):
                value = ExtOldQuantity(value_str)
                found = True

            if not found:
                supported_value_str = ", ".join(([x.value for x in ExtOldQuantity]))
                raise ValueError(
                    f"QUANTITY '{value_str}' not supported. Supported values: {supported_value_str}"
                )
        return value

    @field_validator("operand", mode="before")
    @classmethod
    def validate_operand(cls, value):
        return enum_value_parser(value, Operand)

    @model_validator(mode="after")
    def validate_varname(self):
        if self.varname and self.filetype != ExtOldFileType.NetCDFGridData:
            raise ValueError(
                "VARNAME only allowed when FILETYPE is 11 (NetCDFGridData)"
            )
        return self

    @field_validator("extrapolation_method")
    @classmethod
    def validate_extrapolation_method(cls, v, info):
        method = info.data.get("method")
        valid_extrapolation_method = (
            ExtOldExtrapolationMethod.SpatialExtrapolationOutsideOfSourceDataBoundingBox
        )
        available_extrapolation_methods = [
            ExtOldMethod.InterpolateTimeAndSpaceSaveWeights,
            ExtOldMethod.Obsolete,
        ]
        if (
            v == valid_extrapolation_method
            and method not in available_extrapolation_methods
        ):
            raise ValueError(
                f"EXTRAPOLATION_METHOD only allowed to be {valid_extrapolation_method} when METHOD is "
                f"{available_extrapolation_methods[0]} or {available_extrapolation_methods[1]}"
            )
        return v

    @model_validator(mode="after")
    def validate_factor(self):
        quantity = self.quantity
        if self.factor is not None and not str(quantity).startswith(INITIALTRACER):
            raise ValueError(
                f"FACTOR only allowed when QUANTITY starts with {INITIALTRACER}"
            )
        return self

    @model_validator(mode="after")
    def validate_ifrctyp(self):
        if (
            self.ifrctyp is not None
            and self.quantity != ExtOldQuantity.FrictionCoefficient
        ):
            raise ValueError(
                f"IFRCTYP only allowed when QUANTITY is {ExtOldQuantity.FrictionCoefficient}"
            )
        return self

    @model_validator(mode="after")
    def validate_averagingtype(self):
        if (
            self.averagingtype is not None
            and self.method != ExtOldMethod.AveragingSpace
        ):
            raise ValueError(
                f"AVERAGINGTYPE only allowed when METHOD is {ExtOldMethod.AveragingSpace}"
            )
        return self

    @model_validator(mode="after")
    def validate_relativesearchcellsize(self):
        if (
            self.relativesearchcellsize is not None
            and self.method != ExtOldMethod.AveragingSpace
        ):
            raise ValueError(
                f"RELATIVESEARCHCELLSIZE only allowed when METHOD is {ExtOldMethod.AveragingSpace}"
            )
        return self

    @model_validator(mode="after")
    def validate_extrapoltol(self):
        if self.extrapoltol is not None and self.method != ExtOldMethod.InterpolateTime:
            raise ValueError("EXTRAPOLTOL only allowed when METHOD is 5")
        return self

    @model_validator(mode="after")
    def validate_percentileminmax(self):
        if (
            self.percentileminmax is not None
            and self.method != ExtOldMethod.AveragingSpace
        ):
            raise ValueError(
                f"PERCENTILEMINMAX only allowed when METHOD is {ExtOldMethod.AveragingSpace}"
            )
        return self

    @model_validator(mode="after")
    def validate_area(self):
        if (
            self.area is not None
            and self.quantity != ExtOldQuantity.DischargeSalinityTemperatureSorSin
        ):
            raise ValueError(
                f"AREA only allowed when QUANTITY is {ExtOldQuantity.DischargeSalinityTemperatureSorSin}"
            )
        return self

    @model_validator(mode="after")
    def validate_nummin(self):
        if self.nummin is not None and self.method != ExtOldMethod.AveragingSpace:
            raise ValueError(
                f"NUMMIN only allowed when METHOD is {ExtOldMethod.AveragingSpace}"
            )
        return self

    @field_validator("maxsearchradius")
    def validate_maxsearchradius(cls, v, info):
        if v is not None:
            extrap = info.data.get("extrapolation_method")
            extrapolation_method_value = (
                ExtOldExtrapolationMethod.SpatialExtrapolationOutsideOfSourceDataBoundingBox
            )
            if extrap != extrapolation_method_value:
                raise ValueError(
                    f"MAXSEARCHRADIUS only allowed when EXTRAPOLATION_METHOD is {extrapolation_method_value}"
                )
        return v

    @field_validator("value")
    @classmethod
    def validate_value(cls, v, info):
        if v is not None:
            method = info.data.get("method")
            if method != ExtOldMethod.InterpolateSpace:
                raise ValueError(
                    f"VALUE only allowed when METHOD is {ExtOldMethod.InterpolateSpace} (InterpolateSpace)"
                )
        return v

    @model_validator(mode="before")
    @classmethod
    def choose_file_model(cls, values: Dict[str, Any]) -> Dict[str, Any]:
        """Root-level validator to the right class for the filename parameter based on the filetype.

        The validator chooses the right class for the filename parameter based on the FileType_FileModel_mapping
        dictionary.

        FileType_FileModel_mapping = {
            1: TimModel,
            2: TimModel,
            3: DiskOnlyFileModel,
            4: DiskOnlyFileModel,
            5: DiskOnlyFileModel,
            6: DiskOnlyFileModel,
            7: DiskOnlyFileModel,
            8: DiskOnlyFileModel,
            9: PolyFile,
            10: PolyFile,
            11: DiskOnlyFileModel,
            12: DiskOnlyFileModel,
        }
        """
        # if the filetype and the filename are present in the values
        if any(par in values for par in ["filetype", "FILETYPE"]) and any(
            par in values for par in ["filename", "FILENAME"]
        ):
            file_type_var_name = "filetype" if "filetype" in values else "FILETYPE"
            filename_var_name = "filename" if "filename" in values else "FILENAME"
            file_type = values.get(file_type_var_name)
            raw_path = values.get(filename_var_name)

            if isinstance(raw_path, (Path, str)):
                model = FILETYPE_FILEMODEL_MAPPING.get(int(file_type))
                values[filename_var_name] = resolve_file_model(raw_path, model)

        return values

    @model_validator(mode="before")
    @classmethod
    def validate_sourcemask(cls, data: Any) -> Any:
        filetype = data.get("filetype")
        sourcemask = data.get("sourcemask")

        # Convert string to DiskOnlyFileModel if needed
        if isinstance(sourcemask, str):
            data["sourcemask"] = DiskOnlyFileModel(sourcemask)
            sourcemask = data["sourcemask"]

        if sourcemask and filetype not in [
            ExtOldFileType.ArcInfo,
            ExtOldFileType.CurvilinearData,
        ]:
            raise ValueError("SOURCEMASK only allowed when FILETYPE is 4 or 6")
        return data
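Most of the validators above encode pairwise consistency rules: a keyword is only allowed in combination with a specific METHOD, QUANTITY, or FILETYPE. A minimal stand-alone sketch of one such rule (the VARNAME/FILETYPE check), using plain Python rather than Pydantic:

```python
NETCDF_GRID_DATA = 11  # numeric value of ExtOldFileType.NetCDFGridData

def check_varname(varname, filetype):
    """Raise if VARNAME is given for any filetype other than 11 (NetCDFGridData)."""
    if varname and filetype != NETCDF_GRID_DATA:
        raise ValueError("VARNAME only allowed when FILETYPE is 11 (NetCDFGridData)")

check_varname("air_temperature", 11)  # accepted: NetCDF grid data
try:
    check_varname("air_temperature", 9)  # rejected: polyline file
except ValueError as err:
    print(err)
```

In the real class this pattern is expressed as a Pydantic `@model_validator(mode="after")`, which has access to all parsed fields at once.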

choose_file_model(values) classmethod

Root-level validator that chooses the right class for the filename parameter based on the filetype.

The validator chooses the right class for the filename parameter based on the FileType_FileModel_mapping dictionary:

FileType_FileModel_mapping = {
    1: TimModel,
    2: TimModel,
    3: DiskOnlyFileModel,
    4: DiskOnlyFileModel,
    5: DiskOnlyFileModel,
    6: DiskOnlyFileModel,
    7: DiskOnlyFileModel,
    8: DiskOnlyFileModel,
    9: PolyFile,
    10: PolyFile,
    11: DiskOnlyFileModel,
    12: DiskOnlyFileModel,
}

Source code in hydrolib/core/dflowfm/extold/models.py
@model_validator(mode="before")
@classmethod
def choose_file_model(cls, values: Dict[str, Any]) -> Dict[str, Any]:
    """Root-level validator to the right class for the filename parameter based on the filetype.

    The validator chooses the right class for the filename parameter based on the FileType_FileModel_mapping
    dictionary.

    FileType_FileModel_mapping = {
        1: TimModel,
        2: TimModel,
        3: DiskOnlyFileModel,
        4: DiskOnlyFileModel,
        5: DiskOnlyFileModel,
        6: DiskOnlyFileModel,
        7: DiskOnlyFileModel,
        8: DiskOnlyFileModel,
        9: PolyFile,
        10: PolyFile,
        11: DiskOnlyFileModel,
        12: DiskOnlyFileModel,
    }
    """
    # if the filetype and the filename are present in the values
    if any(par in values for par in ["filetype", "FILETYPE"]) and any(
        par in values for par in ["filename", "FILENAME"]
    ):
        file_type_var_name = "filetype" if "filetype" in values else "FILETYPE"
        filename_var_name = "filename" if "filename" in values else "FILENAME"
        file_type = values.get(file_type_var_name)
        raw_path = values.get(filename_var_name)

        if isinstance(raw_path, (Path, str)):
            model = FILETYPE_FILEMODEL_MAPPING.get(int(file_type))
            values[filename_var_name] = resolve_file_model(raw_path, model)

    return values
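The dispatch can be exercised stand-alone with string stand-ins for the model classes. The mapping below mirrors the one in the docstring; the helper name `model_for` is hypothetical, and the strings stand in for the real TimModel / PolyFile / DiskOnlyFileModel classes:

```python
# String stand-ins for the file model classes, keyed by FILETYPE number
FILETYPE_FILEMODEL_MAPPING = {
    1: "TimModel", 2: "TimModel",
    3: "DiskOnlyFileModel", 4: "DiskOnlyFileModel", 5: "DiskOnlyFileModel",
    6: "DiskOnlyFileModel", 7: "DiskOnlyFileModel", 8: "DiskOnlyFileModel",
    9: "PolyFile", 10: "PolyFile",
    11: "DiskOnlyFileModel", 12: "DiskOnlyFileModel",
}

def model_for(values):
    """Resolve the filetype key case-insensitively, as choose_file_model does."""
    key = "filetype" if "filetype" in values else "FILETYPE"
    return FILETYPE_FILEMODEL_MAPPING.get(int(values[key]))

print(model_for({"FILETYPE": 9, "FILENAME": "bnd.pli"}))  # PolyFile
```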

handle_case_insensitive_tracer_fields(values) classmethod

Handle case-insensitive matching for tracer fields.

Source code in hydrolib/core/dflowfm/extold/models.py
@model_validator(mode="before")
@classmethod
def handle_case_insensitive_tracer_fields(cls, values):
    """Handle case-insensitive matching for tracer fields."""
    values_copy = dict(values)

    # Define the field names and their aliases
    tracer_fields = ["tracerfallvelocity", "tracerdecaytime"]

    for field_i in values.keys():
        if field_i.lower() in tracer_fields:
            # if the field is already lowercase no need to change it
            if field_i != field_i.lower():
                # If the key is not in the expected lowercase format, add it with the correct format
                values_copy[field_i.lower()] = values_copy[field_i]
                # Remove the original key to avoid "extra fields not permitted" error
                values_copy.pop(field_i)

    return values_copy
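What this normalization does to an input dict can be demonstrated in isolation (the helper name and input keys below are illustrative):

```python
def normalize_tracer_keys(values):
    """Lowercase any key matching a tracer field, mirroring the validator above."""
    tracer_fields = ["tracerfallvelocity", "tracerdecaytime"]
    values_copy = dict(values)
    for key in values:
        if key.lower() in tracer_fields and key != key.lower():
            # Re-insert under the expected lowercase name and drop the original
            # key to avoid an "extra fields not permitted" error.
            values_copy[key.lower()] = values_copy.pop(key)
    return values_copy

out = normalize_tracer_keys({"TRACERFALLVELOCITY": 0.001})
print(out)  # {'tracerfallvelocity': 0.001}
```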

validate_quantity_prefix(lower_value, value_str) classmethod

Checks if the provided quantity string starts with any known valid prefix.

If the quantity matches a prefix, ensures it is followed by a name. Returns the full quantity string if valid, otherwise None.

Parameters:

    lower_value (str):
        The quantity string in lowercase. Required.
    value_str (str):
        The original quantity string. Required.

Raises:

    ValueError:
        If the quantity is only the prefix without a name.

Source code in hydrolib/core/dflowfm/extold/models.py
@classmethod
def validate_quantity_prefix(
    cls, lower_value: str, value_str: str
) -> Optional[str]:
    """Checks if the provided quantity string starts with any known valid prefix.

    If the quantity matches a prefix, ensures it is followed by a name.
    Returns the full quantity string if valid, otherwise None.

    Args:
        lower_value (str): The quantity string in lowercase.
        value_str (str): The original quantity string.

    Raises:
        ValueError: If the quantity is only the prefix without a name.
    """
    value = None
    for prefix in ALL_PREFIXES:
        if lower_value.startswith(prefix):
            n = len(prefix)
            if n == len(value_str):
                raise ValueError(
                    f"QUANTITY '{value_str}' should be appended with a valid name."
                )
            value = prefix + value_str[n:]
            break

    return value
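The prefix check can be exercised stand-alone. `ALL_PREFIXES` below is a hypothetical one-element stand-in for the real constant in the module, and the tracer name is illustrative:

```python
ALL_PREFIXES = ("tracerbnd",)  # hypothetical stand-in for the real prefix list

def validate_quantity_prefix(lower_value, value_str):
    """Return the prefixed quantity if valid, None if no prefix matches."""
    for prefix in ALL_PREFIXES:
        if lower_value.startswith(prefix):
            if len(prefix) == len(value_str):
                # Bare prefix without a tracer name is rejected
                raise ValueError(
                    f"QUANTITY '{value_str}' should be appended with a valid name."
                )
            # Normalize the prefix to lowercase, keep the user-supplied name as-is
            return prefix + value_str[len(prefix):]
    return None

print(validate_quantity_prefix("tracerbndrhodamine", "tracerbndRhodamine"))
# tracerbndRhodamine
```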

ExtOldModel

Bases: ParsableFileModel

The overall external forcings model that contains the contents of one external forcings file (old format).

This model is typically referenced under FMModel.external_forcing.extforcefile.

Attributes:

    comment (List[str]):
        The comments in the header of the external forcing file.
    forcing (List[ExtOldForcing]):
        The external forcing/QUANTITY blocks in the external forcing file.

Source code in hydrolib/core/dflowfm/extold/models.py
class ExtOldModel(ParsableFileModel):
    """
    The overall external forcings model that contains the contents of one external forcings file (old format).

    This model is typically referenced under a [FMModel][hydrolib.core.dflowfm.mdu.models.FMModel]`.external_forcing.extforcefile`.

    Attributes:
        comment (List[str]):
            The comments in the header of the external forcing file.
        forcing (List[ExtOldForcing]):
            The external forcing/QUANTITY blocks in the external forcing file.
    """

    comment: List[str] = Field(default=HEADER.splitlines()[1:])
    forcing: List[ExtOldForcing] = Field(default_factory=list)

    @classmethod
    def _ext(cls) -> str:
        return ".ext"

    @classmethod
    def _filename(cls) -> str:
        return "externalforcings"

    @classmethod
    def _get_serializer(
        cls,
    ) -> Callable[[Path, Dict, SerializerConfig, ModelSaveSettings], None]:
        return Serializer.serialize

    @classmethod
    def _get_parser(cls) -> Callable[[Path], Dict]:
        return Parser.parse

    @property
    def quantities(self) -> List[str]:
        """List all the quantities in the external forcings file."""
        return [forcing.quantity for forcing in self.forcing]

quantities property

List all the quantities in the external forcings file.
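The quantities property is a simple projection over the forcing list. Sketched with a hypothetical stand-in dataclass instead of the real ExtOldForcing:

```python
from dataclasses import dataclass

@dataclass
class Forcing:
    """Hypothetical stand-in for ExtOldForcing with only the quantity field."""
    quantity: str

forcings = [Forcing("waterlevelbnd"), Forcing("windx")]
quantities = [f.quantity for f in forcings]  # mirrors ExtOldModel.quantities
print(quantities)  # ['waterlevelbnd', 'windx']
```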