Fixed
Status Update
Comments
je...@google.com <je...@google.com> #2
[Comment deleted]
[Deleted User] <[Deleted User]> #3
The second app is called "YouTube for Google TV" (https://play.google.com/store/apps/details?id=com.google.android.youtube.googletv). It's strange that you can't find it on Google Play; maybe it has some regional or device restrictions.
je...@google.com <je...@google.com> #5
[ replacing my earlier comment with my email address removed ]
Thank you for this post. I am having the same problem with the default pre-installed version of YouTube on my Android TV Nexus Player. When I attempt to "UPDATE YOUTUBE APP" it doesn't actually update it. If I uninstall and re-install YouTube, it simply re-installs the same version (version 1.0.5.4, updated on Mar 26, 2015, 8.62 MB). If I uninstall that, and search on Google Play Store for the com.google.android.youtube.googletv app recommended above, it doesn't find it - it only finds the first app.
You say "But installing of second app is very inconvenient for users." - I agree - it's so inconvenient, I can't figure out even how to do it.
Would you mind telling me how you did it?
Thank you,
Lloyd
je...@google.com <je...@google.com> #6
Any recommendations on how to implement a YouTube player on Android TV? I tried a WebView and got a white screen, and my application crashed after the WebView received focus.
je...@google.com <je...@google.com> #7
Does anyone have any information about this? At the very least, is this a bug or intended behavior?
th...@gmail.com <th...@gmail.com> #9
Same problem here... Would be great to have a response from Google.
il...@gmail.com <il...@gmail.com> #11
Thanks shawns. But does that mean it's a bug that will be fixed (the init error with "Update YouTube App" makes me think so), or does Google simply not want to allow the YouTube Player API on Android TV?
je...@google.com <je...@google.com> #13
Additionally, this problem is affecting end-users as well as developers.
I'm using the API for YouTube thumbnails and am using the onInitializationFailure callback to detect failure. When a user-recoverable error happens, the YouTube app linked to from the dialog is "not available on this device". Very confusing for users, as they can clearly see that it is there.
@Override
public void onInitializationFailure(YouTubeThumbnailView view, YouTubeInitializationResult errorReason) {
    if (errorReason.isUserRecoverableError()) {
        // "activity" is the hosting Activity; "code" is a request-code constant
        // defined elsewhere in that Activity for onActivityResult handling.
        errorReason.getErrorDialog(activity, code).show();
    }
}
fa...@gmail.com <fa...@gmail.com> #14
YouTube for Google TV was replaced by YouTube for Android TV. So I tried to use the YouTube Player API Reference for iframe Embeds and got most functions working, but it runs with a long delay and the styling of the layer is not good.
So, do you have a new API for Android TV?
Here is the address for the iframe API:
https://developers.google.com/youtube/iframe_api_reference
fa...@gmail.com <fa...@gmail.com> #15
Hi everyone, thanks for the reports on this. The YouTubeAPI class is not supported on the Android TV platform, and there are no plans to add support for it at this time.
so...@gmail.com <so...@gmail.com> #16
For your information, I've created the YoutubeTV library (https://github.com/bertrandmartel/youtubetv), a wrapper library for the YouTube Player API Reference for iframe Embeds.
It provides a YoutubeTvView that embeds a WebView with the YouTube iframe player, with the full JavaScript API available from this view.
Also, YoutubeTvFragment holds a YoutubeTvView together with a media control bar (PlaybackOverlayFragment).
ph...@gmail.com <ph...@gmail.com> #17
So ma....@google, tell me please. I'm an end user, and I want to use my Amazon Echo like this: "Alexa, turn on fireplace", and have it launch a YouTube fireplace video. You seem to be saying this cannot be done and will not be accommodated.
Are you folks aware of what's actually occurring out here in the world? This is the future, and can't be difficult to do. Please enable it.
[Deleted User] <[Deleted User]> #18
je...@google.com <je...@google.com> #19
Hi, ma...@gmail.com - are there any revised plans to support YouTubeAPI class on the Android TV platform?
li...@gmail.com <li...@gmail.com> #20
With API v3, when I search with the query term "love", e.g. https://www.googleapis.com/youtube/v3/search?part=snippet&maxResults=20&order=relevance&q=love&safeSearch=none&type=video&videoCaption=any&videoLicense=any&videoSyndicated=any&pageToken=CGQQAA&videoType=any&key=API_KEY , I can get 728 results. After double-checking the video IDs, 29 results were duplicates, which means I received 699 unique results.
So I doubt the "maximum of 500 results" applies in all cases.
li...@gmail.com <li...@gmail.com> #21
Also, I detect the end of the results when the response's resultsPerPage field falls short of the maxResults API parameter. Even though the nextPageToken field was not empty at that point, I didn't rely on nextPageToken to detect the last page, because I suspect I would just get more duplicate videos if I continued with that token.
of...@gmail.com <of...@gmail.com> #22
This is not fixed / not working as expected, so why is it closed?!
hq...@gmail.com <hq...@gmail.com> #23
The issue is not fixed. I can only get 20 pages of 50 results when the total number of results is 1,000,000. It's very sad that this issue is still not solved; several months have passed since it was reported.
je...@google.com <je...@google.com> #24
To update folks on what's going on:
We can't provide more than ~500 search results for any arbitrary YouTube query via the API without the quality of the search results severely degrading (duplicates, etc.).
The v1/v2 GData API was updated back in November to limit the number of search results returned to 500. If you specify a start-index of 500 or more, you won't get back any results.
This was supposed to have also gone into effect for the v3 API (which uses a different method of paging through results) but it apparently was not pushed out, so it is still possible to retrieve up to 1000 search results in v3—the last 500 of which are usually of bad quality.
The change to limit v3 to 500 search results will be pushed out sometime in the near future. There will no longer be nextPageTokens returned once you hit 500 results.
I understand that the totalResults that are returned is much higher than 500 in all of these cases, but that is not the same thing as saying that we can effectively return all X million possible results. It's meant as an estimate of the total size of the set of videos that match a query and normally isn't very useful.
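To illustrate, a minimal sketch of how that paging behaves from the client side with the google-api-python-client (the API key and query below are placeholders, not part of this report); the loop simply stops once no nextPageToken comes back, which in practice happens at roughly 500 results:

from googleapiclient.discovery import build

# Placeholder key; use your own API key from the Google API Console.
youtube = build('youtube', 'v3', developerKey='YOUR_API_KEY')

def collect_search_results(query, max_results=50):
    items, page_token = [], None
    while True:
        params = dict(part='snippet', q=query, type='video', maxResults=max_results)
        if page_token:
            params['pageToken'] = page_token
        response = youtube.search().list(**params).execute()
        items.extend(response.get('items', []))
        page_token = response.get('nextPageToken')
        # Stop when no further token is returned or the page came back short.
        if not page_token or len(response.get('items', [])) < max_results:
            break
    return items

videos = collect_search_results('love')
print(len(videos))  # tops out around 500, regardless of pageInfo.totalResults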
ma...@gmail.com <ma...@gmail.com> #26
The "nextPageToken" mechanism is unbelievable in the first place. In order to get the Nth page, you need to retrieve N pages. Absolutely demential. How is it possible that you replaced the "start-index" parameter with this nonsense?
of...@intellix.eu <of...@intellix.eu> #27
Agreed with #26. I think this API needs some serious QA / testing.
[Deleted User] <[Deleted User]> #28
I think that if you do a search and want only results from the Nth page, then you're doing the search wrong.
The only case I can think of where it might be an issue is if you want to retrieve all result pages in parallel. I always got the impression the page token for the Nth page is actually always the same; if so, even this isn't an issue.
da...@gmail.com <da...@gmail.com> #29
Is there a way to go beyond 500 results when returning all videos from a certain category? I'm trying to get all the videos above a certain view count for various categories. Some categories have more than 500 usable results, so I'm unable to get my full data.
[Deleted User] <[Deleted User]> #30
@#29
If you want more results, you can split the query over multiple time-intervals (using updatedAfter and updatedBefore). If you search per month, you can already get up to 6000 results for a given year.
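A rough sketch of that workaround against the v3 API, where the corresponding parameters are publishedAfter and publishedBefore (the query, key, and date windows below are only illustrative, and every window is still subject to the same per-query cap):

import datetime
from googleapiclient.discovery import build

# Placeholder key; use your own API key from the Google API Console.
youtube = build('youtube', 'v3', developerKey='YOUR_API_KEY')

def search_window(query, start, end):
    """Page through one publishedAfter/publishedBefore window."""
    items, token = [], None
    while True:
        params = dict(
            part='snippet', q=query, type='video', maxResults=50,
            publishedAfter=start.isoformat('T') + 'Z',
            publishedBefore=end.isoformat('T') + 'Z',
        )
        if token:
            params['pageToken'] = token
        response = youtube.search().list(**params).execute()
        items.extend(response.get('items', []))
        token = response.get('nextPageToken')
        if not token:
            return items

# Roughly month-long windows over one year; each window can return up to ~500 results.
all_items = []
window_start = datetime.datetime(2015, 1, 1)
for _ in range(12):
    window_end = window_start + datetime.timedelta(days=31)
    all_items.extend(search_window('love', window_start, window_end))
    window_start = window_end
print(len(all_items))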
be...@gmail.com <be...@gmail.com> #31
@t...@wuriddles.com (reply #28)
Suppose there's an error condition processing a page of results and you need to restart from a certain page...
pa...@gmail.com <pa...@gmail.com> #32
I don't know what the status is (is it really fixed?)... Based on the suggestion in #30, I am trying to fetch the list of videos from a particular channel for just one day, i.e. publishedAfter and publishedBefore span a range of one day (24 hours), but the result is still 3000+ videos, which isn't possible for a single channel.
so...@gmail.com <so...@gmail.com> #33
@ #32 pankaj...@gmail.com
If you mean the number under pageInfo.totalResults, then I'd ignore that. I haven't found any indication that it's even vaguely correlated to the actual number of results.
(I half suspect it's the count of results before any filters were applied, but that's just speculation.)
sh...@gmail.com <sh...@gmail.com> #34
[Comment deleted]
zj...@gmail.com <zj...@gmail.com> #35
Sad that you can only get 500 results from a search. Why is there this limit?
jo...@gmail.com <jo...@gmail.com> #36
Please increase the limit.
al...@gmail.com <al...@gmail.com> #37
[Comment deleted]
[Deleted User] <[Deleted User]> #38
I'm trying to fetch 2000 channels using the search API, but after 500 results nextPageToken is null. Why is it null?
al...@gmail.com <al...@gmail.com> #39
I'm also running into this issue when using the subscriptions list endpoint. totalResults correctly indicates the number of subscribers, but after 1000 results the next page token is null. I understand if this is a limitation with the API, but the documentation is misleading.
The description for the myRecentSubscribers field reads: "Note that this parameter only supports retrieval of the most recent 1000 subscribers to the authenticated user's channel. To retrieve a complete list of subscribers, use the mySubscribers parameter." This implies that mySubscribers can be used to fetch the full list of subscribers for one's channel.
Can either the documentation or (preferably) the API be updated? :)
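For what it's worth, a minimal sketch of the call being discussed, using an OAuth flow like the Python sample further down in this thread (the client secrets file and scope are placeholders); as reported above, the token stops coming back after roughly 1000 items even when totalResults is much larger:

from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Placeholder OAuth client secrets file and read-only scope.
flow = InstalledAppFlow.from_client_secrets_file(
    'client_secret.json',
    ['https://www.googleapis.com/auth/youtube.readonly'],
)
credentials = flow.run_local_server()
youtube = build('youtube', 'v3', credentials=credentials)

subscribers, token = [], None
while True:
    params = dict(part='subscriberSnippet', mySubscribers=True, maxResults=50)
    if token:
        params['pageToken'] = token
    response = youtube.subscriptions().list(**params).execute()
    subscribers.extend(response.get('items', []))
    token = response.get('nextPageToken')
    if not token:  # in practice this disappears after ~1000 subscribers
        break
print(len(subscribers))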
ca...@gmail.com <ca...@gmail.com> #40
Thanks, Google, for the time I lost because of your poor documentation on this issue.
an...@gmail.com <an...@gmail.com> #41
I make the request like this.
# -*- coding: utf-8 -*-

import os

import google.oauth2.credentials
import google_auth_oauthlib.flow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from google_auth_oauthlib.flow import InstalledAppFlow

# The CLIENT_SECRETS_FILE variable specifies the name of a file that contains
# the OAuth 2.0 information for this application, including its client_id and
# client_secret.
CLIENT_SECRETS_FILE = "client_secret.json"

# This OAuth 2.0 access scope allows for full read/write access to the
# authenticated user's account and requires requests to use an SSL connection.
SCOPES = ['https://www.googleapis.com/auth/youtube.force-ssl']
API_SERVICE_NAME = 'youtube'
API_VERSION = 'v3'


def get_authenticated_service():
    flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRETS_FILE, SCOPES)
    credentials = flow.run_console()
    return build(API_SERVICE_NAME, API_VERSION, credentials=credentials)


def print_response(response):
    print(response)


# Build a resource based on a list of properties given as key-value pairs.
# Leave properties with empty values out of the inserted resource.
def build_resource(properties):
    resource = {}
    for p in properties:
        # Given a key like "snippet.title", split into "snippet" and "title", where
        # "snippet" will be an object and "title" will be a property in that object.
        prop_array = p.split('.')
        ref = resource
        for pa in range(0, len(prop_array)):
            is_array = False
            key = prop_array[pa]
            # For properties that have array values, convert a name like
            # "snippet.tags[]" to snippet.tags, and set a flag to handle
            # the value as an array.
            if key[-2:] == '[]':
                key = key[0:len(key) - 2]
                is_array = True
            if pa == (len(prop_array) - 1):
                # Leave properties without values out of the inserted resource.
                if properties[p]:
                    if is_array:
                        ref[key] = properties[p].split(',')
                    else:
                        ref[key] = properties[p]
            elif key not in ref:
                # For example, the property is "snippet.title", but the resource does
                # not yet have a "snippet" object. Create the snippet object here.
                # Setting "ref = ref[key]" means that the next time through the
                # "for pa in range ..." loop, we will be setting a property in the
                # resource's "snippet" object.
                ref[key] = {}
                ref = ref[key]
            else:
                # For example, the property is "snippet.description", and the resource
                # already has a "snippet" object.
                ref = ref[key]
    return resource


# Remove keyword arguments that are not set.
def remove_empty_kwargs(**kwargs):
    good_kwargs = {}
    if kwargs is not None:
        for key, value in kwargs.items():  # .iteritems() is Python 2 only
            if value:
                good_kwargs[key] = value
    return good_kwargs


def search_list_by_keyword(client, **kwargs):
    # See the full sample for this function.
    kwargs = remove_empty_kwargs(**kwargs)
    response = client.search().list(
        **kwargs
    ).execute()
    return print_response(response)


if __name__ == '__main__':
    # When running locally, disable OAuthlib's HTTPS verification. When
    # running in production *do not* leave this option enabled.
    os.environ['OAUTHLIB_INSECURE_TRANSPORT'] = '1'
    client = get_authenticated_service()
    search_list_by_keyword(client,
                           part='snippet',
                           maxResults=50,
                           q='казакстан,политика',
                           type='video,channel,playlist')
ha...@gmail.com <ha...@gmail.com> #42
Six years after this issue was opened, the problem is still not fixed?
I can only get ~550 videos from one channel :( not more!
[Deleted User] <[Deleted User]> #43
Yup, I'm also getting fewer than 500 records, even after using next_page_token.
Is there still no solution to increase the limit and get more records?
[Deleted User] <[Deleted User]> #44
publishedAfter and publishedBefore = tons of bugs! I set the time range from 05 10 2019 to 05 11 2019 and there are older (199x-2019) or newer videos in the results! This whole API is so bad... So we can only get 500 results.
[Deleted User] <[Deleted User]> #45
Google, you should add more advanced search options to YouTube. I want to search all channels ordered by subscriptions! We can't find what we want! I thought I could create an application using your API, but no way; the API is broken (publishedAfter and publishedBefore).
gc...@gmail.com <gc...@gmail.com> #46
OK, I'm trying to fetch the list of subscribers for my channel and getting just 1000 items (nextPageToken is missing on the 20th page of 50 maxResults per page). So how is getting ALL my subscribers not relevant? I mean, I'm NOT searching; I just need a full list of my subscribers, not just 1000, but all 200,000. Can you please help/explain? Cheers.
[Deleted User] <[Deleted User]> #47
Are there any changes to the YouTube API and the result limit?
la...@gmail.com <la...@gmail.com> #48
Does anyone have code showing how to fetch more than 50 video results? Please help me.
un...@gmail.com <un...@gmail.com> #49
Holy fucking shit; Google is garbage. I want to know when my subscribers subscribed so I can see which videos are doing better than others, and what causes a subscriber to sub. This is basic common sense. How the hell are you preventing people from getting a list of their own subscribers? Why does it stop at 1000 records, i.e. 20 pages of 50 maxResults? What dumb fuck came up with this brilliant idea? Fucking crooks.
How the hell do analytics sites like Tubular Labs do it? I'm guessing they're stuffing Google's pockets with under-the-table deals to get around the 50-maxResults-per-page bullshit.
ar...@gmail.com <ar...@gmail.com> #50
The limit for active live streams is just 100. After the second request with maxResults=50, the 'items' part of the response is simply empty; there are no more records about live streams.
bf...@gmail.com <bf...@gmail.com> #51
Throughout this thread there is the recommendation of using a limited time window to get around that limit, but I can't find a way to do that, or any other parameter that would let you narrow results down to under 500 and still procedurally get all of them. Am I missing an option, or is there none?
qr...@gmail.com <qr...@gmail.com> #52
<?php
$channel_id = "UCdOA_1KzXHIp3gmVyyYv2sg"; // subscribe please!
$api_key = "";

// The original request URL was stripped by the issue tracker; a typical
// search.list request for a channel's videos looks like this:
$api_response = file_get_contents('https://www.googleapis.com/youtube/v3/search?part=snippet&channelId=' . $channel_id . '&maxResults=50&key=' . $api_key);
$someArray = json_decode($api_response, true);

for ($i = 0; $i < count($someArray['items']); $i++) {
    echo "<p>";
    echo $someArray['items'][$i]['snippet']['title'];
    echo "</p>";
}

$pagina = $someArray['nextPageToken'] ?? null; // e.g. CAIQAA
if ($pagina !== null) {
    // Repeat the request with '&pageToken=' . $pagina to fetch the next page.
}
?>
in...@gmail.com <in...@gmail.com> #53
Not fixed. Still happening 10 years later.
Yesterday, I requested a paginated search on a channel with >1100 videos and got back 504 videos (excluding channel IDs and playlists).
Today I ran it again a few minutes ago and got 391 videos.
In both cases it was the same channel ID, and the total number of videos was reported as 1172.
How do I get all 1172?
hs...@gmail.com <hs...@gmail.com> #54
+1. Still happening after 10 years.
gr...@googlemail.com <gr...@googlemail.com> #55
Same here. Why?
ja...@me.com <ja...@me.com> #56
The first API response/page I get indicates that there are 601 total search results, with 3 results on the first page, but I don't seem to be given a nextPageToken to access the next pages. Does anyone know why?
ja...@me.com <ja...@me.com> #57
Update to my previous comment: I removed all of the optional parameters from the list request, leaving only the part and the query, and it worked. I don't know if that's related.
pb...@gmail.com <pb...@gmail.com> #58
I'm trying to get the 1,741 comments on a video, and the API gives me 326 of them, at which point no nextPageToken is returned.
ja...@me.com <ja...@me.com> #59
I think it's possible to get more results if you limit the time frame of your search with publishedBefore and publishedAfter. You'd get the maximum within that window, then shift the time frame forward from where you ended up, until you have results across all time frames.
vi...@gmail.com <vi...@gmail.com> #60
In my search results, using YouTube Search v3, there are only 8 video IDs, but totalResults shows 462. I don't even get a nextPageToken.
"pageInfo": {
"totalResults": 462,
"resultsPerPage": 8}
"pageInfo": {
"totalResults": 462,
"resultsPerPage": 8}
Description
NOTE: Use this form ONLY if you are a DEVELOPER reporting bugs about the
YouTube Data or Player APIs. If you are experiencing problems with the
YouTube web site or another YouTube playback mechanism, please use
------------------------------------------------------------------------
Name of API affected:
Issue summary:
Provide a brief summary of the issue you're experiencing.
Steps to reproduce issue:
1. Do a search query with max-results 50, for example:
2. When the count is > 1000 results, try to get all results:
- by start-index with v2
- by nextPageToken with v3
3. Observed behavior:
- the v3 search stops between 10 and 15 pages (around 500 results): no nextPageToken is returned
- the v2 search returns error 400 or error 500, and some strange results, for example page 1 => 50 results, page 4 => 38 results, page 5 => 50 results, etc.
Expected output:
What do you expect to see after performing the above steps?
- v3: nextPageToken until the 1000-result limit of YouTube
- v2: start-index up to 20 pages (50 max-results: 50*20 = 1000), with each page full (50 results when max-results is 50)
Thanks