Revista Cubana de Información en Ciencias de la Salud

versión On-line ISSN 2307-2113

Rev. cuba. inf. cienc. salud vol.30 no.2 La Habana abr.-jun. 2019  Epub 01-Jun-2019



Lack of standards in evaluating YouTube health videos


1 Universitat de Barcelona. Departament de Biblioteconomia, Documentació i Comunicació Audiovisual & Centre de Recerca en Informació, Comunicació i Cultura. Barcelona, España.


This paper is a systematised literature review of YouTube research in health, with the aim of identifying the different keyword search strategies, retrieval strategies and scoring systems used to assess video content. A total of 176 peer-reviewed papers about video content analysis and video evaluation were extracted from the PubMed database. Concerning keyword search strategy, 16 papers (9.09 %) reported that search terms were obtained from tools like Google Trends or from other sources. In just one paper, a librarian was included in the research team. Manual retrieval is a common technique, and just four studies (2.27 %) reported using a different methodology. Manual retrieval also creates dependencies on the YouTube algorithm and consequently yields biased results. Most other methodologies for analysing video content are based on written medical guidelines rather than on video, because a standard methodology is lacking. For several reasons, reliability cannot be verified; in addition, because the studies cannot be repeated, their results cannot be verified or compared. This paper offers some guidelines to improve research on YouTube, including guidelines to avoid YouTube dependencies and scoring system issues.

Key words: Social networking; review literature; video-audio media; video recording; instructional films and videos; audiovisual recording; instruction; audiovisual aids; YouTube




Video content from the social network site YouTube is a primary source of information in the health field. Since the site's founding in 2005, any registered user can publish videos on YouTube provided the community rules are satisfied.1,2,3,4,5 There is no editorial control over published videos and no control over the type of content being published: copyrighted video or audio is policed, but the quality and nature of the content itself are not. Moreover, video keywords and metadata use an uncontrolled vocabulary, so retrieving content requires taking social tagging into account.

In the scientific literature of the health field, the quality of the information inside videos is analysed in very different ways, as is user interaction with the videos. In quality analyses, little attention is given to retrieval and search design strategies. Moreover, different methodologies exist for analysing video content. On the one hand, there are quantitative analyses of video content evaluation, where a wide range of methodologies can be found. On the other hand, there are qualitative studies using sentiment analysis techniques based on comment analysis or on interaction with the videos on the platform.

There are different issues regarding video consumption on YouTube. One is user behaviour when searching for videos: searches done by patients are very different from those done by physicians with health knowledge.2 Social tagging therefore makes video retrieval difficult, with the consequence of retrieving biased or misleading information.

However, there are several reasons to publish health videos on YouTube. For example, videos are published to provide resources for instructors in health education,3 to give additional information to health students, to educate patients about preventive actions4 or to report on certain diseases.5 Videos are also published to train future physicians in surgical techniques,6 even when teaching space is limited.

Other literature reviews suggest that video searching should use a snowball technique,7 or point out that the quality of YouTube videos for educating patients or future physicians is underdeveloped.8 There is a need for better algorithms that provide better retrieval results,9 and there are no big differences between video content analysis methods.10 This literature review addresses two questions that consistently appear in the scientific literature.

The first question concerns problems with search design and video retrieval, since no controlled vocabulary exists on YouTube. This generates many issues in information retrieval and content reliability. YouTube retrieval methodologies are diverse. First, videos can be retrieved through manual search, which produces algorithm dependencies and biased retrieval. Second, the YouTube Application Programming Interface (API) can be used to retrieve videos, eliminating the algorithm dependencies from the search strategy. Finally, data mining strategies can be used to retrieve videos and their metadata. When these methodologies are used, not only algorithm dependencies but also social tagging must be considered: because YouTube permits its users an uncontrolled vocabulary, video retrieval is biased.

The second question concerns the different methodologies that analyse the quality of videos with a quantitative approach. These methodologies are not standardised and are diverse, which can confuse future researchers about methodology, with the possibility of obtaining biased results.

The scientific literature reports different methodologies for analysing the quality of information in YouTube videos in the health field. These methodologies permit deciding whether the information contained in videos can serve in educating patients and medical students about diseases or surgical techniques, and whether videos are suitable for use by health professionals and students. Additionally, different video retrieval strategies used by health researchers can be found, including information search design and keyword selection strategies. The goal of this paper is to study these procedures as found in the scientific literature.

This paper is a systematised literature review of YouTube research in health, with the aim of identifying the different keyword search strategies, retrieval strategies and scoring systems used to assess video content.


This research involves a systematised literature review. In this study, 176 peer-reviewed papers were analysed11 using descriptive statistics and qualitative paper analysis. Frequencies and percentages were computed for the outcome variables. In May 2018, a search was run in the PubMed database using the search phrase "YouTube[Title] AND (Educational Measurement[MeSH Terms] OR Video Recording[MeSH Terms] OR Information Dissemination[MeSH Terms] OR Social Media[MeSH Terms] OR YouTube[Other Term])". This search provided 306 papers, which were downloaded into a Microsoft Excel 2017 spreadsheet. The abstract and title of all 306 retrieved papers were read.
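The descriptive-statistics step described above (frequencies and percentages per recorded variable) can be sketched as follows. This is an illustration only, with a made-up sample rather than the authors' actual tooling, and the function name `frequency_table` is hypothetical.

```python
from collections import Counter

def frequency_table(values):
    """Return (value, count, percentage) rows, most frequent first,
    for a list of categorical observations such as the retrieval
    strategy coded for each reviewed paper."""
    counts = Counter(values)
    total = len(values)
    return [(value, count, round(100 * count / total, 2))
            for value, count in counts.most_common()]

# Hypothetical coding of ten papers, for illustration only.
strategies = ["manual"] * 6 + ["api"] * 3 + ["tor"]
for value, count, pct in frequency_table(strategies):
    print(f"{value}: n={count} ({pct} %)")
```

A spreadsheet column exported to a list of labels is enough input; the same routine covers every categorical variable recorded in the review.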

The inclusion criteria were as follows. First, the studies included in this review were articles published from 2005 to April 2018. Second, the papers were written in English, French, German, Portuguese or Spanish. Third, the papers had to use quality information analysis methodologies that evaluate videos with a quantitative approach.

The exclusion criteria covered papers describing qualitative analysis, such as sentiment analysis or YouTube comment analysis, as well as editor letters. These papers were excluded because they evaluate user interaction with videos rather than information quality; they were deleted from the spreadsheet.

All included articles were read in full, and different items were recorded in the spreadsheet: video search design, video inclusion criteria, retrieved metadata and the systems used to analyse the quality of the information contained in the videos.

For video search design, the keyword selection, the use of any technological tool to select keywords or search phrases, and the information retrieval strategies were recorded. For video inclusion criteria, parameters such as length, number of views and language were recorded. All variables were entered into the spreadsheet for further analysis with descriptive statistics.


As YouTube was created in 2005, the timeline of the analysed papers goes from 2005 to April 2018. As shown by the number of articles published (Fig.), YouTube is increasingly becoming a primary source of analysis, and it is likely that there will be more video content assessment methods in future research. Therefore, it is necessary to establish homogeneous and standardised content assessment systems. Furthermore, for comparison and repetition of research, papers should provide a complete list of analysed videos. Of all reviewed papers, only 19 (10.79 %) provided such a list.


Fig-  Years of publication of selected articles. 

Keyword selection strategy

One drawback in using a social network site like YouTube is that both the user and the video creator can use non-conventional or social tagging instead of a controlled vocabulary. This permits the video creator to freely tag their videos, but also the user can use free vocabulary to retrieve videos. The absence of a controlled vocabulary can cause biases in information retrieval, and therefore searches might not find relevant information or might miss some necessary information.

Because of this structure, different strategies are needed to find appropriate keywords that allow searchers to recover relevant and persistent information. Using keyword search tools such as Google Keyword, Google Trends or the autocompletion function (in which YouTube suggests keywords used by other users) is enough for a layperson with no research needs to be satisfied with a search. It is therefore often assumed that there is no need for a librarian on a research team to design an information search strategy for social networks like YouTube. However, to understand how the information is organised, to improve the use of these tools and to save research costs, a research-embedded librarian should be part of the research team.

Having a research-embedded librarian has many advantages for expert information searches in specialised databases such as MEDLINE or Scopus,12 as well as in social media. Only one article (0.56 %) reported the inclusion of a Master's-prepared librarian in the research team.13

Any information search uses a keyword search strategy to obtain information. In 157 articles (89.20 %), however, there was no indication of whether the keywords had been selected in advance or corresponded to criteria established in the research design.

There are different trends in the use of technological tools for keyword selection. The use of Google Trends or Google Insights (n= 12, 6.81 %) was one of the preferred tools to select keywords on YouTube. Currently, Google Trends and Google Insights are the same tool. Google Trends allows the user to graphically analyse the search tendency of a concrete keyword or a set of keywords through time. Other search strategies include a combined keyword search strategy (n= 4, 2.27 %) using newspapers, scientific literature or medical books.14

One combined strategy was using the Google AdWords keyword tool together with MeSH (Medical Subject Headings).15 In other cases, strategies were grouped by the terms selected by patients and health students (n= 2, 1.13 %).

Two papers with Portuguese authors (1.13 %) used medical descriptors as keywords in searches with DeCS (Health Sciences Descriptors).16,17 Another strategy was to perform several iterations of YouTube searches with different keywords before choosing the definitive ones (n= 1, 0.56 %).

Finally, other options included using the YouTube autocomplete option in the search box or looking for the YouTube channels of trustworthy healthcare organisations such as the Red Cross.18 In addition, some papers noted that videos lack the subject headings that make term searching helpful in academic databases.13 No evidence of variation in the terms in videos' metadata could be found.

Information retrieval strategies

One of the great constraints on video retrieval is dependency on the YouTube algorithm. The algorithm ranks results according to many variables, such as the number of views and the "like" or "dislike" votes on a video. Although vocabulary on YouTube is uncontrolled, researchers can obtain video samples large enough to perform content-analysis validations.

Three information retrieval strategies were found in this review. The first was manual retrieval (n= 172, 98.29 %), in which five cases (2.84 %) implemented the snowball technique.19 On YouTube, the snowball technique follows the algorithm's video recommendations20 or retrieves videos that do not appear in searches for the usual keywords.21 In this way, videos are extracted through the algorithm's recommendations until the recommendations repeat in the search results. This means there is a dependency on the algorithm's recommendations to recover, visualise and analyse subsequent videos.4
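The snowball retrieval loop described above can be sketched as a breadth-first crawl. This is a hedged illustration, not code from any reviewed study: the `related` callback stands in for whatever recommendation source a researcher taps (the YouTube interface, the API or a scraped page), and the demo graph is invented.

```python
from collections import deque

def snowball(seed_ids, related, max_videos=200):
    """Breadth-first snowball crawl over recommendations: start from
    seed video IDs and follow recommendation links until no new
    videos appear or the cap is reached. `related` maps a video ID
    to a list of recommended video IDs and is supplied by the
    caller, since the recommendation source varies by study."""
    seen = list(seed_ids)
    queue = deque(seed_ids)
    while queue and len(seen) < max_videos:
        for rec in related(queue.popleft()):
            if rec not in seen and len(seen) < max_videos:
                seen.append(rec)
                queue.append(rec)
    return seen

# Stubbed recommendation graph standing in for YouTube's suggestions.
demo = {"a": ["b", "c"], "b": ["a", "d"], "c": [], "d": []}
print(snowball(["a"], lambda vid: demo.get(vid, [])))  # ['a', 'b', 'c', 'd']
```

The cap matters in practice: without it, the crawl keeps following recommendations, and the sample composition then depends entirely on the algorithm, which is the dependency the review warns about.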

The second technique was the use of the YouTube Application Programming Interface (API; n= 3, 1.70 %). A programmer is needed to use the YouTube API, but it enables automated content retrieval without depending on the algorithm. The snowball technique cannot be used with the YouTube API, but video metadata such as the title, tags, description or URL can be downloaded.22 The YouTube API also gives access to channel metrics if the publisher grants permission through the API.
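A minimal sketch of API-based retrieval, assuming YouTube Data API v3 and a valid key: the `search.list` endpoint and its `part`, `q`, `type`, `maxResults` and `order` parameters are real API features, while the helper name below is our own.

```python
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def build_search_params(query, api_key, max_results=50, order="relevance"):
    """Build the query parameters for a YouTube Data API v3
    search.list call. `order` accepts values such as "relevance",
    "viewCount" or "date", mirroring the sort options in the
    YouTube interface; the API caps maxResults at 50 per page."""
    return {
        "part": "snippet",       # snippet carries title, description, channel
        "q": query,
        "type": "video",
        "maxResults": max_results,
        "order": order,
        "key": api_key,
    }

# An actual fetch (valid API key and network access required) would look like:
#   import requests
#   resp = requests.get(SEARCH_URL, params=build_search_params("diabetes", KEY))
#   video_ids = [item["id"]["videoId"] for item in resp.json()["items"]]
```

Because the query is fully explicit (no cookies, no watch history), the same parameters can be published alongside a paper, which directly addresses the repeatability problem this review raises.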

The third information-retrieval technique found was the use of the TOR network (n= 1, 0.56 %) to avoid dependencies on the YouTube algorithm and to browse anonymously.23 Anonymous searches were also done by eliminating cookies, using a new computer or browsing in the browser's incognito mode. This technique excludes personalisation effects, bias and problems with repeatability.

Selection of videos and data collection

YouTube allows sorting and filtering of results once the search phrase is entered. The filters cover upload date, type, duration and other characteristics, plus sorting options.24 Sorting by relevance was the option most often used (n= 79, 44.88 %). Sorting by views was second (n= 33, 18.75 %), but it is appropriate only if the number of views is high enough (e.g. 1 000 views). Researchers also combined views and relevance (n= 25, 14.20 %) to avoid biased results and to obtain the greatest number of videos.25 Other authors used a different sorting technique (n= 8, 4.54 %), and others used no ordering method at all (n= 32, 18.18 %).

Another topic is the variety of inclusion and exclusion criteria based on duration and views. Twenty-three studies (13.06 %) considered video duration or number of views (table 1). Three studies chose a duration of less than 4 minutes,17,26,27 while six studies (3.40 %) reported that the selected videos were less than 10 minutes long.

There is an exception in table 1: one study included only videos of less than 7 minutes with 150 views or fewer. Among videos with medical content, durations of 10 to 20 minutes predominate.

Table 1-  Inclusion criteria in minutes in selected papers 

* NR= No restrictions.

Regarding inclusion or exclusion criteria, there is no exact criterion defining a threshold number of views for choosing videos; this parameter is very scattered. As shown in table 2, the number of views was used as a criterion in just six studies (3.40 %). In fact, there is no reliable criterion indicating how many views a video needed to be included in a study. Nor are there established criteria for how a view is counted in terms of viewing time, even though YouTube counts qualified views. This means it might make sense to include videos with fewer views in a research study.

Table 2-  Selection of videos by number of views 

Language was another inclusion or exclusion criterion. The main language of choice for the videos was English, and there is little variety in the analysis of videos in other languages. Nineteen studies (10.79 %) chose a different language, such as Finnish, Greek,28 Mongolian,29 Korean30 or Italian.31 Because some working teams were international, there were combinations of languages (table 3). Fourteen studies (7.95 %) did not indicate the language of the videos, but English can be inferred from other information in the papers. These data indicate a need for content analysis in languages other than English.

Table 3-  Inclusion languages in selected papers 

Regarding the number of analysed videos, the average was 153.09 and the median 89. It is therefore possible to estimate that a video evaluation analysis should ideally have a sample of about 90 videos; future research should validate this number.

An issue requiring further discussion is the number of search result pages from which videos are collected. Some authors indicated that users usually look at the first three pages of results;32 others say Google users look only at the first page,33 and others analysed the first 10 pages. Four studies (2.27 %) stated that the result position was saved. This does not seem to be a good idea, because the results could change in the next search using the same term.

YouTube metadata

YouTube, like other social network sites, captures different metadata and public metrics. Metadata has varied since YouTube was created, and each study captured different metadata, depending on what was evaluated. Table 4 presents the metadata and metrics that can be captured on YouTube.

Table 4-  Visible metrics and metadata available in YouTube 

It is possible to retrieve the keywords of videos, but only by using third-party software; just one study retrieved them.15 Another piece of data that could be captured at the outset is the original language of the video, but language served instead as an inclusion or exclusion criterion in the reviewed studies. In addition, videos can be published on YouTube with titles and descriptions in a language different from the original language of the video; for example, a video in English can carry a title and description in German. The same video can therefore appear in search results with its title and description in multiple languages.

Additional data can be captured from the "About" page of the uploader's channel, including the total number of channel views, the date of the channel's creation and the channel's country of origin. A search for a channel name in the YouTube search engine also reveals the number of published videos and the playlists created. Playlists are interesting to consider because they appear in search results and usually contain similar content. Only one article reported considering playlists,34 and only as an exclusion criterion. Table 5 presents the metadata captured across all articles.

Table 5-  Metadata collected from YouTube channels 

Content quality evaluation methodologies

The literature mentions many methodologies for evaluating the quality of video information; these have often been adapted from evaluations of written information. To evaluate a certain medical technique, a specific standard or guide can be used. To assess video content quality, however, there are other aspects that future research should also consider.

In video content analysis, it is necessary to assess not only the quality of the information but also the technical quality of the video. Online video content analysis should evaluate the sound, resolution and quality of the video as a whole or frame by frame. There are, however, no standards or consistency in the methods that facilitate this assessment. Methods exist to evaluate the quality of medical information, but a standardised system of video and pedagogical quality must be defined for future research.

An element that remains unclear is the definition of an instructional video versus an educational video. An instructional video shows how to do something or what to do in certain situations.35 An educational video in the health field is one that patients or doctors view to learn about a certain aspect (e.g. pain, diseases, a type of surgery). The border between the two is weak, however. Forty-nine studies (27.84 %) scored educational videos, and one (0.56 %) scored instructional videos; the rest of the reviewed articles evaluated issues such as information accuracy or YouTube as a source of information.

In addition, given the social scope of YouTube, defining professional-style videos as those made with commercial intent and amateur-style videos as short homemade ones is not a good approach.36 Current technology enables people to create professional-looking videos, making it difficult to distinguish professional from amateur videos. This might not hold in the health arena, but it does in other disciplines.

Different ways exist to evaluate the quality content of the videos, with 62 papers (35.22 %) in the sample using different scoring methods to assess videos in terms of the quality of information and some also assessing technical quality. These scoring methods, however, were adapted from guidelines that assess written information, as there is no standard of evaluation.

The most used methods were Global Quality Systems (GQS; n= 14, 7.95 %), the Medical Video Rating System (MVRS; n= 3, 1.70 %) and the Suitability Assessment of Material (SAM; n= 1, 0.56 %).

GQS uses a 5-point Likert scale to evaluate information quality. It does not seem to be a good indicator, however, as the system was designed to evaluate the quality of information on websites.32 The MVRS is divided into three parts: technical quality (light, sound, resolution, angle and duration), diagnostic accuracy, and efficacy as a clinical example.36,37 SAM enables the user to evaluate the quality and materials of the videos.18
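As a sketch of how a GQS-style score is typically aggregated (one 5-point Likert rating per rater, averaged per video), assuming independent raters; this illustrates the scale's arithmetic only and is not an implementation from any of the reviewed papers.

```python
from statistics import mean

def gqs_score(ratings):
    """Average of independent raters' 5-point Likert ratings for one
    video (Global Quality Score style). Values outside 1-5 are
    rejected, since the scale has no meaning beyond those bounds."""
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("GQS ratings must be on a 1-5 scale")
    return round(mean(ratings), 2)

# Hypothetical example: three raters scoring one video.
print(gqs_score([4, 5, 3]))  # 4.0
```

Averaging raters is the common convention, but as the surrounding text notes, applying a website-oriented scale to video is itself the questionable step, not the arithmetic.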

In addition, other authors have created their own systems to evaluate video information quality. For instance, some authors proposed a scoring system covering the audio and technical quality of the video, together with a score, from an educational point of view, for audio teaching quality, using values of 0, 0.5 and 1.14,38

Other authors propose a system to evaluate educational videos that examines content targeting, technical, authority and pedagogy parameters, with scores from 0 to 2 and no half scores.39 One group of authors used their own system to evaluate videos but did not consider their technical qualities, while other researchers adapted HONcode to evaluate video information quality.40 Another type of assessment rated the quality of information as useful or misleading, indicating the presence or absence of information in the video.41,42,43,44
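The presence/absence approach described above can be sketched as a checklist score, assuming the items are drawn from a written medical guideline; the item names below are hypothetical, chosen only for illustration.

```python
def checklist_score(video_items, guideline_items):
    """Presence/absence scoring: count how many guideline items the
    video covers. Returns (covered, total) so that a coverage ratio
    or a useful/misleading cut-off can be applied downstream."""
    covered = set(video_items) & set(guideline_items)
    return len(covered), len(guideline_items)

# Hypothetical guideline items, for illustration only.
guideline = {"definition", "symptoms", "treatment", "prevention"}
# A video mentioning only symptoms and treatment covers 2 of 4 items.
print(checklist_score({"symptoms", "treatment"}, guideline))  # (2, 4)
```

Splitting the raw count from the verdict keeps the subjective step (where to draw the useful/misleading line) explicit and reportable, which is what the reviewed papers often leave unstated.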

Finally, some authors reported that the video evaluation systems are inadequate, noting issues such as the absence of standards for video evaluation,43,45 absence of indicators and tools to evaluate information on YouTube videos,46 and absence of tools to assess the quality of the videos in health information.47 In addition, quality indicators in patient education videos on YouTube are inconsistently adopted.48

Regarding YouTube as a source of accurate information, 86 studies (48.86 %) considered it a poor source of medical information for their research topic. They reported that the information they sought for further analysis was inaccurate or of poor quality, that information was missing or did not meet appropriate quality requirements and that, qualitatively, the videos did not provide appropriate information.

In contrast, 49 papers (27.84 %) reported that YouTube is a good source of information, pointing out that the information within the videos was accurate or appropriate at an instructional or educational level. The remainder of the articles (n= 42, 23.86 %) either did not report on information accuracy or assessed other issues.

Among the reports on the accuracy of information, comments about quality and information control indicated that the content of the videos was incorrect, reflected issues that were "not relevant",49 was "unsuitable as an educational tool"16 or was "difficult to verify authenticity".50

On the other hand, some studies indicated that the content found in the videos was "useful".2 It must be noted, however, that these reports cover different subjects and specialties in health. Depending on the specialty from which they were analysed, therefore, the videos could be seen as accurate, of good quality and useful, or as inaccurate.


Scientific studies must be described so that they can be repeated and their results replicated. In the important area of health information videos on YouTube, several issues keep the quality of research low. In most cases, studies cannot be repeated because the complete list of videos is not provided.

As a social network site, YouTube poses particular research questions, such as video uploading frequencies and social tagging, and researchers have had to tackle keywords, search strategies, the variety of evaluation methods for content analysis and algorithm dependence.51 There is, therefore, a lack of information competence in part of the study designs, which is reflected in the reviewed literature. Moreover, some descriptions of study design are incomplete, and this is an area on which to focus.

Although YouTube is a very good primary information source, uploading and deletion frequencies are an issue to be faced. During a study, new videos might appear and existing ones might disappear, affecting search results. Thus, unless videos are downloaded during the retrieval phase, some information is gained while other information is lost.

Regarding information competence, authors seem unaware of the need to describe exactly the search terms and parameters behind their results.52 They also seem unaware of the personalisation and dynamics of the recommendation algorithm, which lead YouTube to deliver very different results for the same query, even at the same time. This means that, in most cases, the analyses cannot be repeated.

According to the review, having a research librarian on the research team to design the keyword search strategies and retrieval methodologies could improve research.12 Also, researchers should consider using data mining tools to download videos and their metadata or anonymous browsing to retrieve videos; both would help avoid dependency on YouTube algorithms, which is crucial for the possibility of repeating an analysis.34

Regarding content analysis, the data obtained in this review suggest that content analysis research is trustworthy when around 90 videos of 10-20 minutes' duration are analysed. More research is needed in this regard, however, as specialists in the same field can reach different results.

There are some limitations to this study. First, the papers analysed dealt with information quality in YouTube videos, and articles based on sentiment analysis were not reviewed; although this was a large sample compared with other reviews, it is likely that other articles were missed. Second, the PubMed search may have missed papers that are not indexed with MeSH descriptors or keywords, since such articles are indexed only by author, paper title or DOI; it is therefore also likely that other evaluation methodologies did not appear in the search results. Third, the educational measurement descriptor in MeSH refers to the assessment of academic or educational achievement, and video is excluded as a learning resource within teaching strategies.

There are many evaluation methodologies for quality information content analysis. It seems necessary to create a homogeneous, standardised video evaluation system in a consensual manner; otherwise, the different methodologies could lead to different results even in the same area of health. Also, quality is a subjective term that should be quantified in a video evaluation system, and issues such as sound or image clarity do not appear in the literature on video evaluation. In future work, a video evaluation system will be proposed to address these issues.

Finally, special attention to keyword search strategy, retrieval methodologies and avoiding algorithm dependencies should be considered in future research.


YouTube is a primary source of information for analysis, but as this literature review shows, many issues remain to be solved. An online video evaluation system is necessary, and it should be standardised with the right indicators. Although the presence of metadata has been used to evaluate videos, other indicators such as sound quality, image quality and information accuracy are needed in a field like health. Indicators based on the number of views are also required, although this is the least reliable parameter.

It is also necessary to create health-specific websites where content is controlled, for example by a librarian, to avoid flawed publications with problems such as missing information or clickbait. Not only will this benefit research, it will also make issues such as controlled vocabulary in information retrieval less of a stumbling block.


1. Forsyth SR, Malone RE. 'I'll be your cigarette--light me up and get on with it': examining smoking imagery on YouTube. Nicotine Tob Res. 2010;12(8):810-6. [ Links ]

2. Fernández-Llatas C, Traver V, Borras-Morell JE, Martínez-Millana A, Karlsen R. Are health videos from hospitals, health organizations, and active users available to health consumers? An Analysis of Diabetes Health Video Ranking in YouTube. Comput Math Methods Med. 2017;2017:8194-940. [ Links ]

3. Burke S, Snyder S, Rager R. An assessment of faculty usage of YouTube as a teaching resource. Internet J Allied Health Sci Pract. 2009 [access: 2018/12/27];7(1). [ Links ]

4. Sharma R, Lucas M, Ford P, Meurk C, Gartner CE. YouTube as a source of quit smoking information for people living with mental illness. Tob Control. 2016;25(6):634-7. [ Links ]

5. Backinger CL, Pilsner AM, Augustson EM, Frydl A, Phillips T, Rowden J. YouTube as a source of quitting smoking information. Tob Control [Internet]. 2011 [access: 2018/12/27];20(2):119-22. [ Links ]

6. Schmidt RS, Shi LL, Sethna A. Use of streaming media (YouTube) as an educational tool for surgeons-a survey of AAFPRS members. JAMA Facial Plast Surg [Internet]. 2016 [access: 2018/12/27];18(3):230-1. [ Links ]

7. Sampson M, Cumber J, Li C, Pound CM, Fuller A, Harrison D. A systematic review of methods for studying consumer health YouTube videos, with implications for systematic reviews. PeerJ [Internet]. 2013 [access: 2018/12/27];1:e147. [ Links ]

8. Gabarron E, Fernández-Luque L, Armayones M, Lau AY. Identifying measures used for assessing quality of YouTube videos with patient health information: A review of current literature. Interact J Med Res. 2013;2(1):e6. [ Links ]

9. Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: A systematic review. Health Informatics J. 2015(3):173-94. [ Links ]

10. Drozd B, Couvillon E, Suárez A. Medical YouTube videos and methods of evaluation: literature review. JMIR Med Educ. 2018;4(1):e3. [ Links ]

11. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Inf Libr J. 2009;26(2):91-108. [ Links ]

12. Brahmi FA, Kaplan FTD. Embedded librarian as research team member. J Hand Surg. 2017;42(3):210-2. [ Links ]

13. Owens JK, Warner Stidham A, Owens EL. Disaster evacuation for persons with special needs: a content analysis of information on YouTube. Appl Nurs Res ANR. 2013;26(4):273-5. [ Links ]

14. Sunderland N, Camm CF, Glover K, Watts A, Warwick G. A quality assessment of respiratory auscultation material on YouTube. Clin Med Lond Engl. 2014;14(4):391-5. [ Links ]

15. Williams D, Sullivan SJ, Schneiders AG, Ahmed OH, Lee H, Balasundaram AP, et al. Big hits on the small screen: an evaluation of concussion-related videos on YouTube. Br J Sports Med. 2014;48(2):107-11. [ Links ]

16. Barreto Tavares Chiavone F, de Lima Ferreira L, Tuani Candido de Oliveira Salvador P, Filgueira Martins Rodrigues CC, Yasmin Andrade Alves K, Pereira Santos VE. Analysis of YouTube videos about urinary catheterization technique of male delay. Investig Educ En Enfermeria. 2016;34(1):171-9. [ Links ]

17. Tourinho FSV, de Medeiros KS, Salvador PTCDO, Castro GLT, Santos VEP. Analysis of the YouTube videos on basic life support and cardiopulmonary resuscitation. Rev Col Bras Cir. 2012;39(4):335-9. [ Links ]

18. Dubey D, Amritphale A, Sawhney A, Dubey D, Srivastav N. Analysis of YouTube as a source of information for West Nile Virus infection. Clin Med Res. 2014;12(3-4):129-32. [ Links ]

19. Nasim A, Blank MD, Cobb CO, Berry BM, Kennedy MG, Eissenberg T. How to freak a Black & Mild: a multi-study analysis of YouTube videos illustrating cigar product modification. Health Educ Res. 2014;29(1):41-57. [ Links ]

20. Bueno M, Nishi ÉT, Costa T, Freire LM, Harrison D. Blood Sampling in Newborns: A systematic review of YouTube videos. J Perinat Neonatal Nurs. 2017;31(2):160-5. [ Links ]

21. Frongia G, Mehrabi A, Fonouni H, Rennert H, Golriz M, Günther P. YouTube as a potential training resource for laparoscopic fundoplication. J Surg Educ. 2016;73(6):1066-71. [ Links ]

22. Syed-Abdul S, Fernández-Luque L, Jian W-S, Li Y-C, Crain S, Hsu M-H, et al. Misleading health-related information promoted through video-based social media: anorexia on YouTube. J Med Internet Res. 2013;15(2):e30. [ Links ]

23. Koller U, Waldstein W, Schatz K-D, Windhager R. YouTube provides irrelevant information for the diagnosis and treatment of hip arthritis. Int Orthop. 2016;40(10):1995-2002. [ Links ]

24. Salvador PTC de O, Costa TD da, Gomes AT de L, Assis YMS de, Santos VEP. Patient safety: characterization of YouTube videos. Rev Gaucha Enferm. 2017;38(1):e61713. [ Links ]

25. VanderKnyff J, Friedman DB, Tanner A. Framing life and death on YouTube: the strategic communication of organ donation messages by organ procurement organizations. J Health Commun. 2015;20(2):211-9. [ Links ]

26. Steadman M, Chao MS, Strong JT, Maxwell M, West JH. C U L8ter: YouTube distracted driving PSAs use of behavior change theory. Am J Health Behav. 2014;38(1):3-12. [ Links ]

27. Brna PM, Dooley JM, Esser MJ, Perry MS, Gordon KE. Are YouTube seizure videos misleading? Neurologists do not always agree. Epilepsy Behav [Internet]. 2013 [access: 2018/12/27];29(2):305-7. [ Links ]

28. Athanasopoulou C, Suni S, Hätönen H, Apostolakis I, Lionis C, Välimäki M. Attitudes towards schizophrenia on YouTube: A content analysis of Finnish and Greek videos. Inform Health Soc Care. 2016;41(3):307-24. [ Links ]

29. Tsai FJ, Sainbayar B. Portrayal of tobacco in Mongolian language YouTube videos: policy gaps. Tob Control. 2016;25(4):480-2. [ Links ]

30. Nam HK, Bang SM, Rhie YJ, Park SH, Lee K-H. Qualitative assessment of precocious puberty-related user-created contents on YouTube. Ann Pediatr Endocrinol Metab. 2015(3):143-9. [ Links ]

31. Covolo L, Ceretti E, Passeri C, Boletti M, Gelatti U. What arguments on vaccinations run through YouTube videos in Italy? A content analysis. Hum Vaccines Immunother. 2017;13(7):1693-9. [ Links ]

32. Singh AG, Singh S, Singh PP. YouTube for information on rheumatoid arthritis-a wakeup call? J Rheumatol. 2012;39(5):899-903. [ Links ]

33. Krauss MJ, Sowles SJ, Stelzer-Monahan HE, Bierut T, Cavazos-Rehg PA. 'It Takes Longer, but When It Hits You It Hits You!': Videos about marijuana edibles on YouTube. Subst Use Misuse. 2017;52(6):709-16. [ Links ]

34. Ajumobi AB, Malakouti M, Bullen A, Ahaneku H, Lunsford TN. YouTube™ as a source of instructional videos on bowel preparation: a content analysis. J Cancer Educ. 2016;31(4):755-9. [ Links ]

35. Al-Busaidi IS, Anderson TJ, Alamri Y. Qualitative analysis of Parkinson's disease information on social media: the case of YouTube™. EPMA J. 2017;8(3):273-7. [ Links ]

36. Fat MJL, Doja A, Barrowman N, Sell E. YouTube videos as a teaching tool and patient resource for infantile spasms. J Child Neurol. 2011;26(7):804-9. [ Links ]

37. Knight K, van Leeuwen DM, Roland D, Moll HA, Oostenbrink R. YouTube: are parent-uploaded videos of their unwell children a useful source of medical information for other parents? Arch Dis Child. 2017;102(10):910-4. [ Links ]

38. Camm CF, Sunderland N, Camm AJ. A quality assessment of cardiac auscultation material on YouTube. Clin Cardiol. 2013;36(2):77-81. [ Links ]

39. Azer SA, AlEshaiwi SM, AlGrain HA, AlKhelaif RA. Nervous system examination on YouTube. BMC Med Educ [Internet]. 2012 [access: 2018/12/27];12(1):126. [ Links ]

40. Stellefson M, Chaney B, Ochipa K, Chaney D, Haider Z, Hanik B, et al. YouTube as a source of chronic obstructive pulmonary disease patient education: a social media content analysis. Chron Respir Dis. 2014;11(2):61-71. [ Links ]

41. Erdem H, Sisik A. The Reliability of Bariatric Surgery Videos in YouTube Platform. Obes Surg. 2018;28(3):712-6. [ Links ]

42. Pandey A, Patni N, Singh M, Sood A, Singh G. YouTube as a source of information on the H1N1 influenza pandemic. Am J Prev Med. 2010;38(3):e1-3. [ Links ]

43. Singh SK, Liu S, Capasso R, Kern RC, Gouveia CJ. YouTube as a source of information for obstructive sleep apnea. Am J Otolaryngol. 2018;39(4):378-82. [ Links ]

44. Sorensen JA, Pusz MD, Brietzke SE. YouTube as an information source for pediatric adenotonsillectomy and ear tube surgery. Int J Pediatr Otorhinolaryngol. 2014;78(1):65-70. [ Links ]

45. Duncan I, Yarwood-Ross L, Haigh C. YouTube as a source of clinical skills education. Nurse Educ Today [Internet]. 2013 [access: 2018/12/27];33(12):1576-80. [ Links ]

46. Basch CH, Mouser C, Clark A. Distracted driving on YouTube: implications for adolescents. Int J Adolesc Med Health [Internet]. 2017 [access: 2018/12/27];31(2). [ Links ]

47. McLean JL, Suchman EL. Video lecture capture technology helps students study without affecting attendance in large microbiology lecture courses. J Microbiol Biol Educ [Internet]. 2016 [access: 2018/10/2];17(3). [ Links ]

48. Borusiak P, Langer T, Tibussek D, Becher T, Jenke AC, Cagnoli S, et al. YouTube as a source of information for children with paroxysmal episodes. Klin Padiatr. 2013;225(7):394-7. [ Links ]

49. Gupta HV, Lee RW, Raina SK, Behrle BL, Hinduja A, Mittal MK. Analysis of youtube as a source of information for peripheral neuropathy. Muscle Nerve. 2016;53(1):27-31. [ Links ]

50. Tanwar R, Khattar N, Sood R, Makkar A. Benign prostatic hyperplasia related content on YouTube: unregulated and concerning. Recenti Prog Med. 2015;106(7):337-41. [ Links ]

51. Shah C. ContextMiner: supporting the mining of contextual information for ephemeral digital video preservation. Int J Digit Curation. 2009 [access: 2018/12/17];4(2):171-83. [ Links ]

52. Khairutdinov RR, Nalimova IS, Sosnovskaya GI. Information competence development on the basis of professional-oriented video materials. Eur Res Stud J. 2017;20:136-44. [ Links ]

Received: January 23, 2019; Accepted: May 10, 2019

* Corresponding author. E-mail:

The author declares that no conflict of interest exists in this paper.

This is an open-access article distributed under the terms of the Creative Commons Attribution License.