# QcmpCompressionLibrary issues
https://code.it4i.cz/BioinformaticDataCompression/QcmpCompressionLibrary/-/issues

## Issue #3: Support timepoints in IPlaneLoader API
https://code.it4i.cz/BioinformaticDataCompression/QcmpCompressionLibrary/-/issues/3
Author: Vojtech Moravec · Updated: 2020-11-18

Currently `IPlaneLoader` doesn't support choosing the source timepoint. This is a necessary addition for further improvements to QcmpLibrary.
Example:
```java
int[] loadPlaneData(final int plane); --> int[] loadPlaneData(final int timepoint, final int plane);
```

## Issue #2: Training global codebook from multiple timepoints
https://code.it4i.cz/BioinformaticDataCompression/QcmpCompressionLibrary/-/issues/2
Author: Vojtech Moravec · Updated: 2020-10-21

Currently, the global codebook is created from all planes in the dataset, **but just from a single timepoint**. This was fine because our test datasets had only one timepoint.
With datasets that have multiple timepoints, this could lead to worse compression results, mainly a higher compression error.
The first idea is to load the data from all planes and all timepoints, but that would likely exhaust the system's memory.
I think the solution is to *sample* the dataset and choose training planes across all timepoints. We would probably add a parameter limiting the maximum memory used during compression, which would in turn control the sample size.
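The sampling idea above could be sketched roughly as follows. This is a minimal sketch, not the library's actual implementation: the class, method, and parameter names are hypothetical, and the per-plane memory cost is assumed to be known up front. It picks every k-th plane across the flattened (timepoint, plane) sequence so the training sample is spread uniformly over all timepoints while staying under a memory budget.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: choose training planes evenly across all timepoints
// so the global codebook sees every timepoint without loading the whole
// dataset into memory.
final class TrainingPlaneSampler {
    // Identifies one plane of one timepoint in the dataset.
    record PlaneId(int timepoint, int plane) { }

    static List<PlaneId> samplePlanes(final int timepointCount,
                                      final int planesPerTimepoint,
                                      final long bytesPerPlane,
                                      final long maxMemoryBytes) {
        final long totalPlanes = (long) timepointCount * planesPerTimepoint;
        // How many planes fit into the memory budget (at least one).
        final long budgetPlanes = Math.max(1, maxMemoryBytes / bytesPerPlane);
        // Take every stride-th plane so the sample is spread uniformly
        // over the whole (timepoint, plane) sequence.
        final long stride = Math.max(1,
                (long) Math.ceil(totalPlanes / (double) budgetPlanes));

        final List<PlaneId> sample = new ArrayList<>();
        for (long i = 0; i < totalPlanes; i += stride) {
            sample.add(new PlaneId((int) (i / planesPerTimepoint),
                                   (int) (i % planesPerTimepoint)));
        }
        return sample;
    }
}
```

With, say, 4 timepoints of 10 planes each and a budget of 10 planes, this selects every 4th plane, so each timepoint contributes to the codebook training set.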
In order to implement this, #3 must be finished first.