ObjectDetectionLabels#

class ObjectDetectionLabels[source]#

Bases: Labels

A set of boxes and associated class_ids and scores.

Implemented using the TensorFlow Object Detection API’s BoxList class.

__init__(npboxes: array, class_ids: array, scores: array = None)[source]#

Construct a set of object detection labels.

Parameters
  • npboxes (array) – float numpy array of shape (n, 4) with columns ymin, xmin, ymax, xmax. Should be in pixel coordinates within the global frame of reference.

  • class_ids (array) – int numpy array of size n with class IDs.

  • scores (array) – float numpy array of size n. Defaults to None.
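
Example (a minimal construction sketch; the import path and the use of plain numpy arrays are assumptions about typical usage, not part of this reference):

    import numpy as np
    from rastervision.core.data import ObjectDetectionLabels  # assumed import path

    # Two boxes in global pixel coordinates, one row per box: ymin, xmin, ymax, xmax.
    npboxes = np.array([
        [10., 20., 50., 60.],
        [30., 40., 90., 100.],
    ])
    class_ids = np.array([0, 1])
    scores = np.array([0.9, 0.75])

    labels = ObjectDetectionLabels(npboxes, class_ids, scores=scores)
    print(labels.get_npboxes().shape)  # (2, 4)
    print(labels.get_class_ids())      # [0 1]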

Methods

__init__(npboxes, class_ids[, scores])

Construct a set of object detection labels.

assert_equal(expected_labels)

concatenate(labels1, labels2)

Return concatenation of labels.

filter_by_aoi(aoi_polygons)

Return a copy of these labels filtered by given AOI polygons.

from_boxlist(boxlist)

Make ObjectDetectionLabels from BoxList object.

from_geojson(geojson[, bbox, ioa_thresh, clip])

Convert GeoJSON to ObjectDetectionLabels object.

from_predictions(windows, predictions)

Instantiate from windows and their corresponding predictions.

get_boxes()

Return list of Boxes.

get_class_ids()

get_npboxes()

get_overlapping(labels, window[, ...])

Return subset of labels that overlap with window.

get_scores()

global_to_local(npboxes, window)

Convert from global to local coordinates.

local_to_global(npboxes, window)

Convert from local to global coordinates.

local_to_normalized(npboxes, window)

Convert from local to normalized coordinates.

make_empty()

Instantiate an empty instance of this class.

normalized_to_local(npboxes, window)

Convert from normalized to local coordinates.

prune_duplicates(labels, score_thresh, ...)

Remove duplicate boxes via non-maximum suppression.

save(uri, class_config, crs_transformer[, bbox])

Save labels as a GeoJSON file.

to_boxlist()

to_dict([round_boxes])

Returns a dict version of these labels.

__add__(other: ObjectDetectionLabels) ObjectDetectionLabels[source]#

Add labels to these labels.

Returns a concatenation of this and the other labels.

Parameters

other (ObjectDetectionLabels) –

Return type

ObjectDetectionLabels
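
Example (a sketch continuing the constructor example above; per the description, + returns the concatenation of the two label sets):

    more_labels = ObjectDetectionLabels(
        np.array([[5., 5., 25., 25.]]), np.array([1]), scores=np.array([0.6]))

    combined = labels + more_labels  # labels from the constructor example
    print(len(combined.get_boxes()))  # 3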

__getitem__(window: Box) ObjectDetectionLabels[source]#
Parameters

window (Box) –

Return type

ObjectDetectionLabels
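
Example (a sketch continuing the examples above; the Box import path is assumed, and the reading that indexing returns the labels overlapping the window follows from the return type, with the precise overlap rule presumably that of get_overlapping below):

    from rastervision.core.box import Box  # assumed import path

    window = Box(ymin=0, xmin=0, ymax=64, xmax=64)
    labels_in_window = labels[window]
    print(type(labels_in_window).__name__)  # ObjectDetectionLabels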

__init__(npboxes: array, class_ids: array, scores: array = None)[source]#

Construct a set of object detection labels.

Parameters
  • npboxes (array) – float numpy array of shape (n, 4) with columns ymin, xmin, ymax, xmax. Should be in pixel coordinates within the global frame of reference.

  • class_ids (array) – int numpy array of size n with class IDs.

  • scores (array) – float numpy array of size n. Defaults to None.

__setitem__(window: Box, item: Dict[str, ndarray])[source]#
Parameters
  • window (Box) –

  • item (Dict[str, ndarray]) –

assert_equal(expected_labels: ObjectDetectionLabels)[source]#
Parameters

expected_labels (ObjectDetectionLabels) –

static concatenate(labels1: ObjectDetectionLabels, labels2: ObjectDetectionLabels) ObjectDetectionLabels[source]#

Return concatenation of labels.

Parameters
  • labels1 (ObjectDetectionLabels) –

  • labels2 (ObjectDetectionLabels) –

Return type

ObjectDetectionLabels

filter_by_aoi(aoi_polygons: Iterable[Polygon])[source]#

Return a copy of these labels filtered by given AOI polygons.

Parameters

aoi_polygons (Iterable[Polygon]) – List of AOI polygons to filter by, in pixel coordinates.

static from_boxlist(boxlist: NpBoxList)[source]#

Make ObjectDetectionLabels from BoxList object.

Parameters

boxlist (NpBoxList) –

static from_geojson(geojson: dict, bbox: Optional[Box] = None, ioa_thresh: float = 0.8, clip: bool = True) ObjectDetectionLabels[source]#

Convert GeoJSON to ObjectDetectionLabels object.

If bbox is provided, boxes whose intersection-over-area (IOA) with the bbox falls below ioa_thresh are filtered out, and the remaining boxes are optionally clipped to the bbox.

Parameters
  • geojson (dict) – Normalized GeoJSON (see VectorSource).

  • bbox (Optional[Box]) – Bounding box, in pixel coordinates.

  • ioa_thresh (float) – Minimum IOA for a box to be kept. Defaults to 0.8.

  • clip (bool) – If True, clip the kept boxes to the bbox. Defaults to True.

Returns

ObjectDetectionLabels

Return type

ObjectDetectionLabels
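
Example (a hedged sketch; in a real pipeline the GeoJSON would come from a VectorSource already normalized to pixel coordinates, so the hand-built feature below, including its class_id property, is an assumption about that normalized form; Box and ObjectDetectionLabels are imported as in the earlier sketches):

    # GeoJSON coordinates are (x, y) = (col, row); this polygon spans
    # xmin=20, ymin=10, xmax=60, ymax=50 in pixel coordinates.
    geojson = {
        'type': 'FeatureCollection',
        'features': [{
            'type': 'Feature',
            'geometry': {
                'type': 'Polygon',
                'coordinates': [[[20, 10], [60, 10], [60, 50], [20, 50], [20, 10]]],
            },
            'properties': {'class_id': 0},
        }],
    }

    od_labels = ObjectDetectionLabels.from_geojson(
        geojson, bbox=Box(ymin=0, xmin=0, ymax=256, xmax=256),
        ioa_thresh=0.8, clip=True)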

classmethod from_predictions(windows: Iterable[Box], predictions: Iterable[Any]) Labels#

Instantiate from windows and their corresponding predictions.

This makes no assumptions about the type or format of the predictions. Subclasses should implement the __setitem__ method to correctly handle the predictions.

Parameters
  • windows (Iterable[Box]) – Boxes in pixel coords, specifying chips in the raster.

  • predictions (Iterable[Any]) – The model predictions for each chip specified by the windows.

Returns

An object of the Labels subclass on which this method is called.

Return type

Labels

get_boxes() List[Box][source]#

Return list of Boxes.

Return type

List[Box]

get_class_ids() ndarray[source]#
Return type

ndarray

get_npboxes() ndarray[source]#
Return type

ndarray

static get_overlapping(labels: ObjectDetectionLabels, window: Box, ioa_thresh: float = 0.5, clip: bool = False) ObjectDetectionLabels[source]#

Return subset of labels that overlap with window.

Parameters
  • labels (ObjectDetectionLabels) – Labels to take the subset of.

  • window (Box) – Window to test for overlap against.

  • ioa_thresh (float) – The minimum intersection-over-area (IOA) for a box to be considered as overlapping. For each box, IOA is defined as the area of the intersection of the box with the window over the area of the box.

  • clip (bool) – If True, clip label boxes to the window.

Return type

ObjectDetectionLabels
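
Example (continuing the earlier sketches; keeps only boxes with at least half their area inside a chip window and clips them to it):

    window = Box(ymin=0, xmin=0, ymax=64, xmax=64)
    overlapping = ObjectDetectionLabels.get_overlapping(
        labels, window, ioa_thresh=0.5, clip=True)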

get_scores() ndarray[source]#
Return type

ndarray

static global_to_local(npboxes: ndarray, window: Box)[source]#

Convert from global to local coordinates.

The global coordinates are row/col within the extent of a RasterSource. The local coordinates are row/col within the window frame of reference.

Parameters
  • npboxes (ndarray) –

  • window (Box) –

static local_to_global(npboxes: ndarray, window: Box)[source]#

Convert from local to global coordinates.

The local coordinates are row/col within the window frame of reference. The global coordinates are row/col within the extent of a RasterSource.

Parameters
  • npboxes (ndarray) –

  • window (Box) –

static local_to_normalized(npboxes: ndarray, window: Box)[source]#

Convert from local to normalized coordinates.

The local coordinates are row/col within the window frame of reference. Normalized coordinates range from 0 to 1 on each (height/width) axis.

Parameters
  • npboxes (ndarray) –

  • window (Box) –

classmethod make_empty() ObjectDetectionLabels[source]#

Instantiate an empty instance of this class.

Returns

An object of the Labels subclass on which this method is called.

Return type

ObjectDetectionLabels
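
Example (a one-line sketch, continuing the earlier examples):

    empty = ObjectDetectionLabels.make_empty()
    print(len(empty.get_boxes()))  # 0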

static normalized_to_local(npboxes: ndarray, window: Box)[source]#

Convert from normalized to local coordinates.

Normalized coordinates range from 0 to 1 on each (height/width) axis. The local coordinates are row/col within the window frame of reference.

Parameters
  • npboxes (ndarray) –

  • window (Box) –

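Example (a sketch of moving a box between the three frames of reference described above, reusing the imports from the earlier sketches; the .copy() calls are a defensive assumption in case any of these helpers modify their input in place):

    window = Box(ymin=100, xmin=200, ymax=356, xmax=456)  # a 256x256 chip

    # One box in global pixel coordinates that falls inside the window.
    global_boxes = np.array([[110., 210., 150., 260.]])

    local_boxes = ObjectDetectionLabels.global_to_local(global_boxes.copy(), window)
    norm_boxes = ObjectDetectionLabels.local_to_normalized(local_boxes.copy(), window)

    print(local_boxes)  # offsets relative to the window's origin
    print(norm_boxes)   # fractions of the window's height/width
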
static prune_duplicates(labels: ObjectDetectionLabels, score_thresh: float, merge_thresh: float, max_output_size: Optional[int] = None) ObjectDetectionLabels[source]#

Remove duplicate boxes via non-maximum suppression.

Parameters
  • labels (ObjectDetectionLabels) – Labels whose boxes are to be pruned.

  • score_thresh (float) – Prune boxes with score less than this threshold.

  • merge_thresh (float) – Prune boxes with intersection-over-union (IOU) greater than this threshold.

  • max_output_size (Optional[int]) – Maximum number of retained boxes. If None, this is set to len(labels). Defaults to None.

Returns

Pruned labels.

Return type

ObjectDetectionLabels
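
Example (continuing the earlier sketches; the thresholds are illustrative, not recommendations):

    pruned = ObjectDetectionLabels.prune_duplicates(
        combined, score_thresh=0.5, merge_thresh=0.5, max_output_size=100)
    print(len(pruned.get_boxes()))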

save(uri: str, class_config: ClassConfig, crs_transformer: CRSTransformer, bbox: Optional[Box] = None) None[source]#

Save labels as a GeoJSON file.

Parameters
  • uri (str) – URI of the output file.

  • class_config (ClassConfig) – ClassConfig to map class IDs to names.

  • crs_transformer (CRSTransformer) – CRSTransformer to convert from pixel-coords to map-coords before saving.

  • bbox (Optional[Box]) – User-specified crop of the extent. Must be provided if the corresponding RasterSource has bbox != extent.

Return type

None
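
Example (a hedged sketch; the ClassConfig construction, the IdentityCRSTransformer stand-in, and the import paths are assumptions for illustration, while in a real pipeline the CRSTransformer would come from the corresponding RasterSource):

    from rastervision.core.data import ClassConfig, IdentityCRSTransformer  # assumed import paths

    class_config = ClassConfig(names=['vehicle', 'building'])  # maps class IDs to names
    labels.save(
        uri='predictions.json',
        class_config=class_config,
        crs_transformer=IdentityCRSTransformer())  # no-op transformer for illustration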

to_boxlist() NpBoxList[source]#
Return type

NpBoxList

to_dict(round_boxes: bool = True) dict[source]#

Returns a dict version of these labels.

The dict has Boxes as keys and (class_id, score) tuples as values.

Parameters

round_boxes (bool) –

Return type

dict
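
Example (continuing the constructor sketch above):

    d = labels.to_dict()
    for box, (class_id, score) in d.items():
        print(box, class_id, score)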