Bug 1773 - variance of a Tensor
Summary: variance of a Tensor
Status: NEW
Alias: None
Product: Eigen
Classification: Unclassified
Component: Tensor
Version: 3.5 (future version)
Hardware: x86 - 64-bit
OS: Linux
Importance: Normal Feature Request
Assignee: Nobody
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2019-11-05 22:41 UTC by william.tambellini
Modified: 2019-12-04 18:54 UTC
CC: 5 users




Description william.tambellini 2019-11-05 22:41:18 UTC
Hi,
As of today there are:
- a Tensor::mean() method to compute the mean of a tensor over a given dim, which a priori just calls the MeanReducer:
    template <typename Dims> EIGEN_DEVICE_FUNC EIGEN_STRONG_INLINE
    const TensorReductionOp<internal::MeanReducer<CoeffReturnType>, const Dims, const Derived>
    mean(const Dims& dims) const {
      return TensorReductionOp<internal::MeanReducer<CoeffReturnType>, const Dims, const Derived>(derived(), dims, internal::MeanReducer<CoeffReturnType>());
    }
 
- some reducers in TensorFunctors.h 
https://eigen.tuxfamily.org/dox-devel/unsupported/TensorFunctors_8h_source.html

but no native way to compute the variance of a Tensor.
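
For reference, calling such a reduction looks like this (a minimal sketch; shapes are illustrative):

    #include <unsupported/Eigen/CXX11/Tensor>

    int main() {
      Eigen::Tensor<float, 3> t(2, 3, 4);
      t.setRandom();
      const Eigen::array<Eigen::Index, 1> dims = {1}; // reduce over dim 1
      Eigen::Tensor<float, 2> m = t.mean(dims);       // result shape (2, 4)
      return 0;
    }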

I see some variance computation in TF but it doesn't seem to use Eigen:
https://github.com/tensorflow/tensorflow/tree/a062a8c1f6a89c88a908813add05a0bfd5d523b9/tensorflow/core/lib/histogram

One way would be to first call the mean reducer and then compute the variance manually by looping over all coeffs, but that would not take advantage of packets.
For the end user, it would perhaps be better to add a Tensor::var() method, or at least a VarReducer callable via reduce(dims, reducer).

Would it be acceptable to add a VarReducer in TensorFunctors? (A rough sketch of what it could look like follows below.)
Kind regards
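
PS: an untested sketch of such a VarReducer, modeled on MeanReducer in TensorFunctors.h. It keeps the running sum as reducer state (as MeanReducer keeps its counts) and the sum of squares in the accumulator. Scalar path only; a matching internal::reducer_traits specialization marking it stateful, like MeanReducer's, would presumably also be needed:

    #include <unsupported/Eigen/CXX11/Tensor>

    namespace Eigen { namespace internal {

    template <typename T> struct VarReducer {
      static const bool PacketAccess = false; // scalar path only in this sketch
      static const bool IsStateful = true;

      EIGEN_DEVICE_FUNC EIGEN_STRONG_INLINE VarReducer() : count_(0), sum_(0) {}

      EIGEN_DEVICE_FUNC EIGEN_STRONG_INLINE void reduce(const T t, T* accum) {
        sum_ += t;       // running sum, kept as reducer state
        *accum += t * t; // running sum of squares, kept in the accumulator
        count_++;
      }

      EIGEN_DEVICE_FUNC EIGEN_STRONG_INLINE T initialize() const { return T(0); }

      EIGEN_DEVICE_FUNC EIGEN_STRONG_INLINE T finalize(const T accum) const {
        const T mean = sum_ / T(count_);
        // naive E[x^2] - E[x]^2; a production version should use a
        // numerically stabler scheme (e.g. Welford's algorithm)
        return accum / T(count_) - mean * mean;
      }

     protected:
      DenseIndex count_;
      T sum_;
    };

    }} // namespace Eigen::internal

Usage would then be e.g. out = in.reduce(dims, Eigen::internal::VarReducer<float>());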
Comment 1 william.tambellini 2019-11-06 00:43:27 UTC
The (dummy) "manual" implementation on dim d (in is the input Tensor<T,R>, out the rank R-1 result):
   const Eigen::array<Eigen::Index, 1> dim = {d}; // dim to reduce
   Eigen::Tensor<T,R-1> mean = in.mean(dim);
   Eigen::array<Eigen::Index, R> ns; // new shape: 1 on the reduced dim
   for (short i = 0; i < R; ++i)
      ns[i] = (i == d ? 1 : in.dimension(i));
   Eigen::array<Eigen::Index, R> bc; // broadcast factors back to in's shape
   for (short i = 0; i < R; ++i)
      bc[i] = (i == d ? in.dimension(i) : 1);
   Eigen::Tensor<T,R> xMinusMean = in - mean.reshape(ns).broadcast(bc);
   Eigen::Tensor<T,R-1> sumOfSquared = xMinusMean.square().sum(dim);
   // divide by N for the population variance (N-1 for the sample variance)
   out = sumOfSquared / (T)in.dimension(d);

I'm a little worried about the speed. Comparison with Eigen Matrix TBC; a sketch for cross-checking the values is below.
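
For checking the values (not yet the speed) against the Matrix API, a minimal sketch reducing dim 0 of a 2-D tensor:

    #include <unsupported/Eigen/CXX11/Tensor>
    #include <Eigen/Dense>
    #include <iostream>

    int main() {
      Eigen::Tensor<float, 2> in(4, 3);
      in.setRandom();
      // Matrix view over the same data (both are column-major by default)
      Eigen::Map<const Eigen::MatrixXf> m(in.data(), 4, 3);
      for (int j = 0; j < 3; ++j) {
        const float mean = m.col(j).mean();
        const float var  = (m.col(j).array() - mean).square().mean();
        std::cout << "col " << j << " var = " << var << "\n";
      }
      return 0;
    }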
Comment 2 Nobody 2019-12-04 18:54:24 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to gitlab.com's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance: https://gitlab.com/libeigen/eigen/issues/1773.
