NaNs in samples.tsv #695
It does not impact the downstream pipeline. If the filename is empty (NA), the default path in atlas is used and the pipeline should work. However, it would still be better to write the correct names there. You said there are only some NA rows, so you know the pattern for imputing them. I'll keep this issue open and try to fix it in a later version.
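Since the missing filenames follow a known per-sample pattern, they can be imputed with a small script. This is a minimal sketch (not atlas code): the toy TSV content and the `{sample}_QC_R1.fastq.gz` naming pattern are assumptions, so adapt both to your actual `samples.tsv` layout.

```python
import csv
import io

# Toy samples.tsv content; in practice, read your real file instead.
SAMPLES_TSV = """\
sample\tReads_QC_R1\tReads_QC_R2
S1\tS1_QC_R1.fastq.gz\tS1_QC_R2.fastq.gz
S2\t\t
"""

def impute_qc_paths(text):
    """Fill empty/NA QC read columns using a hypothetical naming pattern."""
    rows = list(csv.DictReader(io.StringIO(text), delimiter="\t"))
    for row in rows:
        for col in ("Reads_QC_R1", "Reads_QC_R2"):
            # pandas writes missing cells as NaN; treat "", "NA", "NaN" as missing
            if row[col] in ("", "NA", "NaN"):
                suffix = col.split("_")[-1]  # "R1" or "R2"
                row[col] = f"{row['sample']}_QC_{suffix}.fastq.gz"
    return rows

rows = impute_qc_paths(SAMPLES_TSV)
print(rows[1]["Reads_QC_R1"])  # S2_QC_R1.fastq.gz
```

The filled-in table can then be written back out with `csv.DictWriter` using the same tab delimiter.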
I'll continue writing here. I'm rerunning atlas, this time with many more samples, and atlas is crashing on "BG1", "BG2" ... "BG10", "BGmock". There are no NaNs in this column and no empty strings. BinGroups are <150 in size. Occasionally it will also produce the following error:
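The claim that the column has no NaNs or empty strings can be double-checked with a quick diagnostic script. A minimal sketch, assuming a `BinGroup` column as discussed above (the toy data here is hypothetical):

```python
import csv
import io
from collections import Counter

# Toy samples.tsv content; point this at your real file in practice.
SAMPLES_TSV = """\
sample\tBinGroup
BG1_a\tBG1
BG1_b\tBG1
BG2_a\tBG2
"""

rows = list(csv.DictReader(io.StringIO(SAMPLES_TSV), delimiter="\t"))

# Samples whose BinGroup cell is missing or NA-like.
bad = [r["sample"] for r in rows if r["BinGroup"] in ("", "NA", "NaN")]

# Size of each BinGroup, to confirm all groups are under the limit.
sizes = Counter(r["BinGroup"] for r in rows)

print("rows with missing BinGroup:", bad)
print("BinGroup sizes:", dict(sizes))
```

If `bad` comes back empty and every group size looks sane, the crash is likely elsewhere in how the groups are handled.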
I can guess where the bug is.
Sent! When I started the run for the first time, atlas had generated a short string in a new row at the end of the csv, which I believe caused the first error, so I promptly removed it. After each
For this step I suggest using this script to move the files yourself.
I will then send you a correct samples.tsv.
Fixed by atlas v2.18.1.
Hello Silas. I was able to run atlas 2.18.0; however, I noticed that for some of my samples in
samples.tsv
there are NaNs in the following columns:
Reads_QC_R1
Reads_QC_R2
Reads_QC_se
I'm not really sure what this means, as the files do exist at the corresponding paths. All other columns are filled out appropriately. I'm writing to see if there is perhaps something I missed and whether this would have had any effect on the downstream processing. I have QC stats, assemblies, bins and mapping counts for these samples, so I'm a bit confused. Thanks!
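To see exactly which samples are affected, the three QC columns can be scanned for NA-like cells. A minimal sketch with toy data (replace the inline string with your real samples.tsv):

```python
import csv
import io

# Toy samples.tsv with one sample missing its QC read paths.
SAMPLES_TSV = """\
sample\tReads_QC_R1\tReads_QC_R2\tReads_QC_se
S1\tS1_R1.fastq.gz\tS1_R2.fastq.gz\tS1_se.fastq.gz
S2\tNA\tNA\tNA
"""

QC_COLS = ["Reads_QC_R1", "Reads_QC_R2", "Reads_QC_se"]

rows = list(csv.DictReader(io.StringIO(SAMPLES_TSV), delimiter="\t"))
missing = [
    r["sample"]
    for r in rows
    if any(r[c] in ("", "NA", "NaN") for c in QC_COLS)
]
print("samples with missing QC paths:", missing)
```

Listing the affected samples makes it easy to confirm that their QC files exist on disk even though the table cells are empty.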
**Atlas version**
2.18.0