Error "no fmt_url_map or conn information found in video info" with specific video #108
I'm getting the exact same error on this video:
Yep, this one too: http://www.youtube.com/watch?v=TkPiXRNee7A
This patch fixes it, as noted above:
diff --git a/bin/youtube-dl
Any chance a new release could come out with this fix?
Even with your latest update, it is failing on new-format YouTube videos such as:
@gitupandgo I can reproduce the error with the mentioned YouTube URL. The video is completely blocked for requests from Germany, so it'll take some time for me. It looks like there is a promising
@gitupandgo Should work now, although I haven't downloaded the whole video yet, as my US server's connection is really slow. Can you test again with my fork?
Yes, that's fixed it, although it now says while downloading that the file size is 2GB, which is a bit odd because it is a live webcam, streaming continuously without any fixed size or time limit.
@gitupandgo Oh, ok. I'll leave my test running and see if it gets over 2GB. Can you watch more than 2GB at a time?
@gitupandgo My test finished: youtube-dl creates a 2GB file. I suspect that the youtube flash client requests a new file every 2GB. How do you think youtube-dl should handle those infinite streams? (Reassembling two or more files is quite hard.) At the moment, you can call youtube-dl in a loop in a shell script or so.
I dunno, but I'd say it would be more "Unixy" if youtube-dl were able to handle the situation without stopping after every 2GB. Since the youtube flash player will play a live webcam stream indefinitely, without any user interaction to make it continue after each 2GB, I think youtube-dl should do the same. youtube-dl could simply append a suffix when saving each 2GB chunk, with filenames like filename_1, filename_2, ..., filename_N for the N-th. Would that work?
or perhaps: filename, filename_2, ..., filename_N
That's a reasonable proposal (we already have similar functionality with
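For illustration, here is a minimal sketch of the stopgap just discussed: a hypothetical wrapper (not part of youtube-dl) that calls the real youtube-dl binary in a loop and names each roughly-2GB chunk filename, filename_2, ..., filename_N. Only youtube-dl's actual -o (output filename) option is assumed; everything else, including the .flv extension and the stop-on-error policy, is a placeholder choice.

```python
# Hypothetical chunked-download wrapper; assumes the youtube-dl binary
# is on PATH and uses its real -o (output filename) option.
import subprocess
import sys

def download_stream(url, basename, max_chunks=1000):
    for n in range(1, max_chunks + 1):
        suffix = '' if n == 1 else '_%d' % n
        outfile = '%s%s.flv' % (basename, suffix)
        # Each invocation ends once youtube closes the ~2GB chunk.
        ret = subprocess.call(['youtube-dl', '-o', outfile, url])
        if ret != 0:
            # Report the failing chunk and stop; a fancier version could
            # ask the user whether to continue instead.
            sys.stderr.write('chunk %d failed (exit code %d)\n' % (n, ret))
            break

if __name__ == '__main__':
    download_stream(sys.argv[1], sys.argv[2])
```

Something like `python chunked_dl.py http://www.youtube.com/watch?v=... webcam` would then produce webcam.flv, webcam_2.flv, and so on, until the stream or an invocation fails.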
@phihag Your fork works for me as well.
@phihag, with your version, I could get rid of the error.
@phihag I tried your patched version and it seems to work. Incidentally, I noticed that there is a release today, but it is giving a 404 error.
youtube-dl is broken! For most videos it gives this ERROR: no fmt_url_map or conn information found in video info. Anyone working on fixing this?
@anondavid A new version is out by now and should have fixed the problem. If it still persists, open a new bug and include:
Thanks. The new version is working for me now. Will let you know if I hit any problems.
I've still got problems with this error.
Oh wait, I didn't have the updated version. Turns out I updated about 10 minutes before it was released.
@phihag thanks mate, excellent work.
@phihag
@gitupandgo Well, there are a lot of corner cases that have to be considered. For example, how do you detect that a stream is infinite? How do you prevent infinite loops of a finite file/stream? What do you do if the title changes from request to request? What do you do on errors? Do you splice the files together? How do you detect a lagging stream? Note that there is no notion of 2GB "files" on youtube's side; it's just packaging the stream into chunks. The problem is not Python-specific at all, but I'm afraid quite complex - too complex for me. I'd appreciate a code suggestion (if possible, with tests).
Ok, I think I see what you're thinking. But I keep asking myself: do these sorts of problems actually occur on youtube, do they occur often enough to worry about, and should it be youtube-dl's job to worry about any of them? I like the simplest approach of keeping youtube-dl at the minimum complexity for downloading: just blindly download each 2GB chunk into a separate file, and don't worry about loops, lags, or name changes. Errors are potentially troublesome, though. If an error occurs while downloading one 2GB chunk, just report it and give the user the option of either stopping or continuing. That probably needs further thought. Once youtube-dl has to start worrying about detecting infinite loops in a finite stream, it is getting to be more like an image processing problem, bringing the risk of over-developing and over-complicating youtube-dl.
It worked for me: youtube-dl --update
May I thank you, phihag; your version works perfectly.
thanks!!
youtube-dl --update is the easiest way to fix that problem.
AFAIK this is fixed.
I'm still having issues with this, running the latest version. It works fine with recorded videos but not with live streams. $ youtube-dl -U
Likewise: $ youtube-dl -v --console-title -c -t http://www.youtube.com/watch?v=oLzaaLev-yc
In particular:
^ cheriff I successfully captured http://www.youtube.com/watch?v=oLzaaLev-yc with ffmpeg by plugging in the m3u8 file; maybe that's worth looking into for the youtube-dl solution. I did run into an issue where ffmpeg was capturing to .ts instead of .mp4; I was told to use "-absf aac_adtstoasc" to do the mp4 capture on the fly.
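For reference, a sketch of that ffmpeg workaround, wrapped in Python to match the other examples here. The m3u8 playlist URL below is a placeholder (you would pull the real one out of the live stream's video info); -acodec/-vcodec copy avoid re-encoding, and -absf aac_adtstoasc is the bitstream filter mentioned above that lets the stream's ADTS AAC audio sit in an .mp4 container.

```python
# Sketch of the ffmpeg capture described above; the playlist URL is a
# hypothetical placeholder, substitute the stream's real m3u8 URL.
import subprocess

m3u8_url = 'http://example.com/live/stream.m3u8'  # placeholder

subprocess.call([
    'ffmpeg',
    '-i', m3u8_url,            # read the HLS playlist directly
    '-acodec', 'copy',         # copy streams instead of re-encoding
    '-vcodec', 'copy',
    '-absf', 'aac_adtstoasc',  # remux ADTS AAC for the MP4 container
    'capture.mp4',
])
```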
Brilliant! I'll give it a shot, and this will definitely work as a stopgap solution. Thanks, saulbass.
Well, with this video http://www.youtube.com/watch?v=1FX4zco0ziY, I was getting an error because it seems that youtube-dl is not finding any valid format.
$ youtube-dl http://www.youtube.com/watch?v=1FX4zco0ziY
[youtube] Setting language
[youtube] 1FX4zco0ziY: Downloading video webpage
[youtube] 1FX4zco0ziY: Downloading video info webpage
[youtube] 1FX4zco0ziY: Extracting video information
ERROR: no fmt_url_map or conn information found in video info
But, while debugging, I found that the problem seems to happen at line 1082:
if 'fmt_url_map' in video_info and len(video_info['fmt_url_map']) >= 1 and ',' in video_info['fmt_url_map'][0]:
For this specific video, the variable video_info['fmt_url_map'] has the value:
['5|http://v10.lscache7.c.youtube.com/videoplayback?sparams=id%2Cexpire%2Cip%2Cipbits%2Citag%2Calgorithm%2Cburst%2Cfactor%2Coc%3AU0hPR1dRVl9FSkNOOV9PS1pB&fexp=901033%2C904531%2C902300&algorithm=throttle-factor&itag=5&ipbits=0&burst=40&sver=3&signature=BF2B1CC7E1919209CAEF7208B1E83C3BC54E50DE.8705DABF3E0172EF8CC92B6828EEFB278BF03D0B&expire=1302768000&key=yt1&ip=0.0.0.0&factor=1.25&id=d455f8cdca34ce26']
So, I just commented out the last part of the if that checked for a comma, and it worked fine. I don't know why the comma check is being done, especially because we make a call to split(',') later. Maybe it covers some special case :D
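For illustration, the relaxed check would look roughly like this. The parsing below is an assumption based on the value shown above (comma-separated 'itag|url' pairs) and on the split(',') call the reporter mentions; the real surrounding code may differ.

```python
# Sketch only: drop the trailing ", ',' in ..." test so a single-entry
# fmt_url_map (one format, hence no comma) still matches. A one-element
# split(',') result is handled fine by the dict comprehension below.
if 'fmt_url_map' in video_info and len(video_info['fmt_url_map']) >= 1:
    url_map = dict(pair.split('|', 1)
                   for pair in video_info['fmt_url_map'][0].split(','))
```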
Well, hope it helps. I'm new to GitHub, so I didn't know if there is another way to suggest a code change than creating an issue; sorry if I posted in the wrong place.