
OrdinalEntropy

The official code for "Improving Deep Regression with Ordinal Entropy" (ICLR 2023). [PDF].

We currently provide detailed code for experiments on the synthetic dataset, along with a new visualization experiment for easy reproduction.

Experiments on the synthetic dataset

Obtain experimental results on the synthetic dataset

Visualization experiment on the synthetic dataset

We add a new visualization experiment on the synthetic dataset for easy reproduction, as the visualization experiments in our paper are on the depth estimation task, which may take some effort to reproduce.

Dataset

For the Linear task:

For the non-linear task:

The dataset above is generated with this code: DeepONet.
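The actual dataset files are produced with the DeepONet code linked above. As a rough, hypothetical stand-in to illustrate what a linear and a non-linear regression target look like (the targets below are illustrative, not the DeepONet operator data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in only: the real files come from the DeepONet
# generator linked above, so both targets here are hypothetical.
x = rng.uniform(0.0, 1.0, size=(1000, 1))
y_linear = 2.0 * x + 1.0               # a simple linear target
y_nonlinear = np.sin(2.0 * np.pi * x)  # a simple non-linear target
```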

Experiments on the Depth Estimation and Crowd Counting

The code for the Depth Baseline can be found here:

The code for the Crowd Counting Baseline can be found here:

The ordinal entropy code for the two tasks can be found here:

The ordinal entropy can be added into the New-CRFs and CSRNet baselines by:

        return x

to

        if self.training:
            return x, encoding
        else:
            return x
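Put together, the modified forward pass follows this pattern. This is a minimal sketch with illustrative layer names and sizes, not the actual New-CRFs or CSRNet code: during training the model exposes the intermediate encoding so the ordinal entropy term can be computed on it, while at evaluation time only the prediction is returned.

```python
import torch
import torch.nn as nn

class RegressorWithFeatures(nn.Module):
    """Sketch of the pattern above (backbone/head names are illustrative)."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(8, 16)  # stand-in for the real encoder
        self.head = nn.Linear(16, 1)      # stand-in for the regression head

    def forward(self, inputs):
        encoding = torch.relu(self.backbone(inputs))  # features for ordinal entropy
        x = self.head(encoding)                       # regression prediction
        if self.training:
            return x, encoding  # expose features only during training
        else:
            return x

model = RegressorWithFeatures()
model.train()
out, feats = model(torch.randn(4, 8))  # training: prediction + features
model.eval()
out_only = model(torch.randn(4, 8))    # evaluation: prediction only
```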
and by changing:

outputs = model(inputs, targets, epoch)

to

outputs, features = model(inputs, targets, epoch)
oe_loss = ordinalentropy(features, targets)
loss = loss + oe_loss
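The `ordinalentropy` function itself is provided in the repository. As a rough, hypothetical sketch of the underlying idea only (not the repo's exact implementation): pairwise distances between normalized features are weighted by the corresponding label distances, so samples with distant targets are encouraged to sit far apart in feature space.

```python
import numpy as np

def ordinal_entropy_sketch(features, targets):
    """Hypothetical sketch of an ordinal-entropy-style term (illustrative,
    not the repository's ordinalentropy function)."""
    # L2-normalize the features
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-8)
    # pairwise feature distances
    diff = f[:, None, :] - f[None, :, :]
    fdist = np.sqrt((diff ** 2).sum(-1) + 1e-12)
    # pairwise label distances, normalized to [0, 1]
    t = targets.reshape(-1, 1)
    tdist = np.abs(t - t.T)
    tdist = tdist / (tdist.max() + 1e-8)
    # reward label-weighted feature spread; negate so it acts as a loss
    n = len(targets)
    mask = ~np.eye(n, dtype=bool)
    return -(tdist * fdist)[mask].mean()

loss = ordinal_entropy_sketch(np.random.randn(8, 16), np.random.rand(8))
```

Minimizing this term increases feature distances most strongly for pairs whose regression targets differ most, which is the ordinal structure the paper aims to impose.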

Visualization results on depth estimation with NYU-v2

The visualization results can be obtained by:

Experiments on the Age Estimation

The code for the Baseline can be found here:

The ordinal entropy code for Age Estimation can be found here:

The ordinal entropy can be added to the Age Estimation baseline in the same way as shown above.

Reference

S. Zhang, L. Yang, M. Bi Mi, X. Zheng, A. Yao, "Improving Deep Regression with Ordinal Entropy," in ICLR, 2023. [PDF].