The insert size (fragment length) currently needs to be configured in config.yaml. It is used during normalization to scale the number of reads such that the coverage becomes 1.
Instead, the median or mean fragment length as determined by Picard CollectInsertSizeMetrics could be used for this.
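Picard CollectInsertSizeMetrics writes a metrics file whose table follows the `## METRICS CLASS` line: a tab-separated header row and a data row. A minimal sketch of pulling the median out of that file, assuming the standard Picard metrics layout (the function name `median_insert_size` is hypothetical, not part of the workflow):

```python
def median_insert_size(metrics_path):
    """Return MEDIAN_INSERT_SIZE from a Picard CollectInsertSizeMetrics file.

    The metrics table sits directly below the '## METRICS CLASS' line:
    first a tab-separated header row, then one data row.
    """
    with open(metrics_path) as fh:
        lines = fh.read().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("## METRICS CLASS"):
            header = lines[i + 1].split("\t")
            values = lines[i + 2].split("\t")
            return float(values[header.index("MEDIAN_INSERT_SIZE")])
    raise ValueError(f"no METRICS CLASS section in {metrics_path}")
```

A rule could then take the metrics file as input and pass this value on, instead of reading `config["fragment_size"]`.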
This should be resolved by #19: deepTools was sometimes crashing, so I now take the median fragment length from Picard as the parameter value for --extendReads.
Sorry I wasn’t more specific: This issue is about rule compute_scaling_factors, which currently uses config["fragment_size"]. This issue can be closed when that is no longer necessary (and when the fragment_size setting has been removed from config.yaml).
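For reference, scaling reads "such that the coverage becomes 1" implies a factor of genome size divided by total sequenced bases. A minimal sketch of what compute_scaling_factors does with the fragment length, under that reading of the issue text (the function name and parameters here are illustrative, not the rule's actual code):

```python
def scaling_factor(n_reads, fragment_length, genome_size):
    """Factor s such that s * n_reads * fragment_length / genome_size == 1,
    i.e. the scaled read counts correspond to exactly 1x coverage.
    fragment_length is the value this issue proposes to take from Picard
    instead of config["fragment_size"]."""
    return genome_size / (n_reads * fragment_length)
```

For example, 10 000 reads extended to 200 bp over a 1 Mb genome give 2x raw coverage, so the factor is 0.5.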