Keywords: Online courses, Massive Open Online Courses, MOOC, open educational resources, instructional design, data mining, content analysis, web-based instruction
MOOCs are a phenomenon of utmost interest to the scientific community due to their exponential growth (Liyanagunawardena, Adams & Williams, 2013; Martínez, Rodríguez & García, 2014; Yuan & Powell, 2013). These courses are a worldwide expanding phenomenon and offer a clear example of disruption (Anderson & McGreal, 2012; Conole, 2013; Vázquez-Cano, López & Sarasola, 2013) due to low fees for participants, massive participation and their adaptation to new social needs regarding education. Hence, the disruptive nature of MOOCs can only be verified if they are taken as experiments to test new methodology, technology and new ways of organizing education (Pernías & Luján-Mora, 2013).
From a pedagogical point of view, the phenomenon can be seen as an ‘effervescence’ rather than a disruption (Roig, Mengual-Andrés & Suarez, 2014), which must not blind us to the reactions they stir. These courses are hosted by varied and diverse platforms, with different backgrounds and approaches which have given rise to MOOCs based on: web-based instruction, the Connectivist Theory and its pedagogical model (Siemens, 2005); tasks, according to competency-based accomplishments (Cormier & Siemens, 2010) and content (Pernías & Lujan-Mora, 2013; Vázquez-Cano, 2013).
Since the inception of MOOCs, the majority of studies have focused mainly on the concept and history of MOOCs; debating their challenges, possibilities and threats; presenting case studies examining one or more platforms and courses; and reflecting on student participation (Liyanagunawardena & al., 2013). Thereafter, the focus shifted mainly to completion rates and course quality per se (Baxter & Haycock, 2014; Halawa, Greene & Mitchell, 2014; Jordan, 2014; Koutropoulos & al., 2012; Rodríguez, 2012) and their pedagogical principles (Glance, Forsey & Riley, 2013; Roig & al., 2014; Vázquez-Cano & al., 2013; Zapata, 2013); design and key components are scarcely addressed.
Despite the lack of consensus on how quality standards should be attained in MOOCs (Haggard, 2013), it is necessary to raise the issue in order to prevent MOOCs from becoming «poor quality video watching sessions of chatting professors which are the basis for a set of self-assessment questions and awarding certificates without prior authentication and no other concern except generating revenue» (Aguaded, 2013: 7-8).
It is therefore important to address what pre-course information is provided, the pedagogical approaches underlying the design, the level of student engagement, the role of course instructors, availability and degree of interaction, resource typology as well as certification structure and process (Vázquez-Cano, 2013; Zapata, 2013).
Research on these training approaches shows that they are founded on a decentralized control over teaching-learning processes (Baggaley, 2014). However, given the accessibility and reach of MOOCs there is almost by definition a wide spectrum of users with a variety of interests and motivations, approaches and learning styles; hence, one of the most difficult challenges is to provide authentic learning experiences, which require the design and development of interactive collaborative processes. Siemens (2005) states that cooperative and collaborative activities as well as interaction with technological resources have a direct impact on students, especially on the way they perceive and process information and on their learning process, thus prompting a new knowledge building approach. Given massive student participation the level of interactivity is addressed through the use of specific Web 2.0 collaborative and communicative tools: chat rooms and forums (Baxter & Haycock, 2014) to discuss concerns and share solutions; blogs, wiki-forums and social networks (Medina-Salguero & Aguaded, 2014), among others, for support and feedback.
Assessment normally conforms to final and summative processes determined by the type of accreditation awarded once the MOOC has been successfully completed. In some cases, the objectives are small-scale goals carried out individually or in pairs, assessed by means of surveys, questionnaires, quizzes, exams, problem sets and other processes that automatically generate badges as evidence of learning.
In short, studies have focused on the characteristics of the platform providers and the success or failure of a given course (Fini, 2009), and less on pedagogical aspects. If we want to maximize learning by analyzing and adapting teaching strategies to individuals, we must critically address the pedagogical design of MOOCs to identify underlying trends in teaching and learning processes. On this basis, the objectives of this research are:
• To analyze the Spanish-language MOOC offering during a given period in order to establish a profile of its pedagogical components.
• To validate a tool that can guide the pedagogical design of MOOCs.
• To distinguish the components that are specific to a MOOC from those dependent on the platform.
• To determine whether the pedagogical components of MOOCs are conditioned by platforms.
The purpose of this research1 is descriptive, with an exploratory sequential mixed-methods design (DEXPLOS) (Creswell, Plano, Gutmann & Hanson, 2008; Hernández, Fernández & Baptista, 2010). This design involves an initial phase of qualitative data gathering and analysis, followed by another in which quantitative data are collected and analyzed; subsequently, a database integrating both is generated, enabling mixed analysis techniques (García, 2011).
Sequential and criteria sampling (McMillan & Schumacher, 2005) for mixed methods (Hernández & al., 2010) is used. The courses were selected according to the following criteria: catalogued in the repository www.MOOC.es; delivered in Spanish; course information available without prior registration; and provide a minimum amount of information to the data collecting instrument.
We therefore focus on ten platforms (Open UGR, Coursera, MiriadaX, Tutellus, Ucam, Udemy, UnedComa, UniMOOC, UNX, UPVX). We discarded Ehusfera (a blog hosting service rather than a MOOC platform), as well as Iversity, CourseSites and edX, among others, because their reference language is not Spanish. This involved analyzing 117 courses from different fields of knowledge available during March 2014 (table 1). The low percentage of courses from Tutellus and Udemy is mainly due to two factors:
• They included material that did not conform to the MOOC concept, such as conferences, videoconferences or lectures on videos, recycled from different sources within the audiovisual repository of the institution and now offered as massive courses.
• They provided very limited information to the research instrument without prior registration. Moreover, there was redundant information on how to use the platform and on certification. It was also noted that there was a high degree of repetition, such that regardless of the course, the data provided was the same.
Consequently, these two platforms were not included in the qualitative sample; the remaining 104 courses represent 81.25% of the population.
For massive course analysis we developed INdiMOOC-EdI (Instrument for Educational and Interactive Indicators in MOOCs). It is an ad hoc data sheet that meticulously collects information provided in the full description of MOOCs. The elements that make up this instrument can be organized into four components, with a total of 27 sub-components rated on various scales (table 2).
To safeguard validity, the first version of the instrument was subjected to the Delphi technique, with experts evaluating the same courses during the same period of time, and to a pilot study of 15 courses across 5 different platforms. The expert panel competence (KC) index rated .75, while the content validity index (CVI) rated .99, which according to Lawshe (1975) is within the satisfactory standard. Reliability and internal consistency were determined by Cronbach's alpha after the sample gathering procedure, i.e., once questions whose answers were measured on an interval scale were eliminated. The 117 courses obtained an alpha value of .614. Some authors (Huh, Delorme & Reid, 2006; Nunnally, 1967) indicate that an alpha value between .5 and .6 is satisfactory in the early stages of research or in an exploratory study such as this one. This statistic combines the correlation coefficient of the items that make up the instrument and its dimensionality (Cortina, 1993).
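Reliability statistics of this kind are straightforward to reproduce. The following is a minimal sketch of Cronbach's alpha in plain Python, using made-up item scores rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of scores
    (one score per respondent). Illustrative sketch only."""
    k = len(items)                       # number of items
    n = len(items[0])                    # number of respondents

    def variance(xs):                    # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Two perfectly correlated items yield alpha = 1, while partially correlated items yield intermediate values, matching the interpretation that alpha combines item correlations and dimensionality.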
A descriptive analysis of quantitative data was carried out according to the identifiers and descriptive features displayed in table 2, together with a categorical principal component analysis that enables a large set of variables to be grouped in a smaller number of explanatory components that stem from the variance among the original data.
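For readers who wish to replicate this type of grouping, the sketch below approximates it with a linear principal component analysis of indicator-coded (one-hot) categorical data in NumPy. This is only a rough analogue of the non-linear CATPCA used in the study, and the course records here are invented for illustration:

```python
import numpy as np

# Hypothetical categorical records (certification, registration) per course;
# the study's real variables come from the INdiMOOC-EdI instrument.
courses = [("free", "open"), ("paid", "closed"), ("mixed", "open"),
           ("free", "open"), ("paid", "closed"), ("mixed", "closed")]

# Indicator (one-hot) coding of the categorical levels.
levels = sorted({v for row in courses for v in row})
X = np.array([[1.0 if v in row else 0.0 for v in levels] for row in courses])

# Centre the data and extract principal components via SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()   # variance ratio per dimension
scores = U[:, :2] * s[:2]         # object scores in the first two dimensions
```

Plotting `scores` on a coordinate axis is what produces figures such as the object-points plot reported later, with courses clustering by shared attribute patterns.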
With the qualitative data (interactive and educational features) we conducted a content analysis that deployed five major categories:
• Learning: styles, learning modality taking place and content format: self-directed learning, empirical and inductive learning, learning through observation; lessons, units, pills or modules.
• Activities and tasks: refers to both modality (compulsory or optional, individual or collective) and typology (questionnaires, tests, readings, practical exercises, problem sets, projects, case studies, questions and answers, participation…).
• Means and resources: traditional and technology-based: videos, slideshows, forums, blogs, wikis, e-mails, interviews, readings, optional additional material.
• Interactivity: or interacting with other people; online meetings, debates or discussions in pairs or groups, sharing doubts and knowledge, collaborative work, flexible and asynchronous communication.
• Assessment: existing assessment procedure, not only modalities and instruments, but also grades and endorsement (self-assessment, peer assessment; questionnaires, tests, rubrics, exams, problem solving, –peertopeer); percentage or weighting in the final grade, grading scale, passing grade, minimum percentage; checking student progress and final endorsement.
Figure 1 shows the relationship between categories and associated codes, taking into account that some codes belong to more than one category.
Figure 1: Existing relationship between categories and associated codes.
Two algorithms of data mining were applied, which will later be described in detail herein: first, a classification algorithm to discern the impact of platforms on the instructional and communication designs underlying the courses; second, an assessment algorithm to ascertain the degree of information provided by the variable course regardless of the platform provider.
The analysis of the information compiled reveals that 98.3% of the courses (n=115) display the title in a clearly visible place, which is crucial to engage participants' interest. Registration periods are either limited (n=38, 32.5%) or unlimited (n=34, 29.1%); in 38.5% of cases (n=45) registration was closed during the study timeframe.
A total of 72.6% (n=85) are sponsored by platforms linked to Higher Education; whilst personal initiative (n=13, 11.1%) or private company sponsors (n=1, 0.9%) are less frequent. As far as fields of knowledge are concerned, almost half of the MOOCs relate to Legal and Social fields (n=49, 41.9%), followed by multidisciplinary MOOCs (n=21, 17.9%), Arts and Humanities together with Science MOOCs (n=15 each, 12.8%). The least offered are Technology (n=10, 8.5%) and Health Sciences MOOCs (n=7, 6%).
Of the courses analyzed, only n=49 (41.9%) specify course relevance for participants. More than half (n=63, 53.8%) lack addressee information. Among those addressing target participants, n=34 (29.1%) note the public at large and n=20 (17.1%) establish a specific profile. In almost 60% of the courses (n=70, 59.8%) there are no prerequisites. Regarding the last two issues, there are five important aspects that prompt registration:
• Including an Introduction to the course in the MOOC website. Almost half of the introductions deal with content (n=47, 40.2%), followed by 38.5% (n=45) which focus on the topic, without being too concise. The rest (n=25, 21.4%) address issues such as timing, objectives, using the system, carrying out tasks, etc.
• Having an introductory video, available in practically all of the courses analyzed (n=98, 83.8%).
• Defining objectives2 is omitted in more than half of the courses (n=67, 57.3%).
• Information on related MOOCs was only available in a small percentage of the courses (n=38, 32.5%).
• Operation of the system is specifically addressed by the platform in most MOOCs (n=91, 77.8%); only in 9.4% (n=11) is this guidance provided by the course itself. In n=14 (12%) it is not specified.
The length of the MOOCs analyzed is normally limited to weeks (n=87, 74.4%), ranging from 6 weeks (n=22, 36.7%) to 7-8 weeks (n=19, 31.7%); unlimited course length is thus a rare occurrence (n=11, 9.4%). Furthermore, weekly dedication is specified in n=83 (70.9%) of the courses, generally ranging from 3 hours (n=28, 46.7%) to more than 5 hours per week (n=19, 31.7%). Less than 2 hours of weekly dedication is infrequent (n=5, 8.3%).
A high percentage (n=84, 71.8%) of courses present the MOOC teaching team in a visible area, with an average of 3 to 4 tutors (M=3.32 and SD=3.148). This information is not displayed in only 17.1% (n= 20) and the remaining courses (n=11, 9.4%) provide no information at all.
Regarding course content, there is a tendency to adopt an open structure of lessons or modules (n=90, 76.9%), with an average of 8 modules per MOOC. Less frequently (n=22, 18.8%), the work plan is limited to weeks, only in closed-structure courses. There is no information available in n=5 courses (4.3%).
As for certification, free-of-charge and charge-bearing modalities are usually combined (n=75, 64.1%). The type of accreditation is normally mixed (n=71, 60.7%): certificates, credentials, badges, medals and so on.
The content analysis resulting from the five categories (activities and tasks, learning, assessment, interactivity, means and resources) previously mentioned displays the given trend within each platform (figure 2).
Figure 2: Educational and interactive features related to the platform.
Regardless of the number of courses within each platform, it is noted that Coursera offers higher quality information with regard to educational and interactive features, followed by MiriadaX and UNED-COMA. On the other hand, and except in the aforementioned three platforms, it is observed that platforms are more vulnerable to and deficient in features such as means and resources, activities, tasks and assessment.
To address this issue a categorical principal component analysis (CATPCA) is carried out, which is non-linear and therefore does not require the strict assumptions of principal component analyses (Molina & Espinosa, 2010), regarding two dimensions as necessary and sufficient to yield representation (figure 3). The data obtained confirm that the amount of variance accounted for by these two dimensions is not high (s2=10.64%), but they underscore a substantial difference among courses within different platforms. In the first dimension the saturating variables are: certification (.943), engagement (.905), dedication_hours (.899), accreditation (.864), registration (-.872) and institution (-.883). The variables that saturate the second dimension are: introductory video (-.717), teaching team (.625), faculty profiles (.608) and modules (-.629). Although there are variables that do not cluster significantly in either dimension, the vast majority have opposing values in one or the other.
Figure 3: Object points/components labeled by means of platform.
Plotting the two dimensions in a coordinate axis displays how courses are grouped according to the platform provider. The outcome reveals the following facts:
• Some platforms are more extensive than others, for instance UniMOOC is within the values x=-2 and y=2, whilst MiriadaX extends from x=-1.
• There is a certain affinity among platforms, which can lead to clusters, inter alia, Coursera and MiriadaX, or UniMOOC and UPVX.
• In the sample studied the pedagogical components of each course are fully determined by the platform that hosts them.
To gain in-depth insight, the data collected were subjected to a classification algorithm using Weka software (Hall & al., 2009), a collection of machine learning algorithms for data mining tasks. The platforms under investigation were taken as the classification variable, producing ten rules that classify 100% of the courses into a platform. The algorithm used was PART (Frank & Witten, 1998), a variant of Quinlan's C4.5 (1993). As an example we provide a fragment of three rules:
The outcome demonstrates that the relevance of the platform is greater than that of each course when it comes to the pedagogical design. For instance, we observed that in MiriadaX, where more courses were analyzed, both types of certification, dedication and limited course length, together with not displaying related courses, are associated with this platform (n=55.0, i.e., all the courses examined).
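While PART itself is implemented in Weka, the general idea of inducing rules over course attributes with the platform as the class can be illustrated with a much simpler baseline, Holte's 1R, in plain Python. The attribute values and platform labels below are invented for illustration and do not reproduce the study's rules:

```python
from collections import Counter, defaultdict

# Toy records: (certification, length, related_courses) -> platform.
# Hypothetical values; the study's real attributes come from INdiMOOC-EdI.
data = [
    (("mixed", "limited", "no"),  "MiriadaX"),
    (("mixed", "limited", "no"),  "MiriadaX"),
    (("free",  "open",    "yes"), "Coursera"),
    (("free",  "open",    "no"),  "Coursera"),
    (("paid",  "limited", "no"),  "Udemy"),
]

def one_r(data):
    """Pick the single attribute whose value -> majority-class rules
    misclassify the fewest training records (the 1R baseline)."""
    n_attrs = len(data[0][0])
    best = None
    for a in range(n_attrs):
        buckets = defaultdict(Counter)
        for attrs, cls in data:
            buckets[attrs[a]][cls] += 1
        rules = {v: c.most_common(1)[0][0] for v, c in buckets.items()}
        errors = sum(cls != rules[attrs[a]] for attrs, cls in data)
        if best is None or errors < best[2]:
            best = (a, rules, errors)
    return best
```

On this toy data the certification attribute alone separates the three platforms with zero training errors, mirroring the finding that platform membership is highly predictable from a handful of course attributes.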
To examine which components are more specific to the course than to the platform, the data were subjected to an algorithm implemented in Weka which rates each attribute by measuring its information gain3 (Witten, Frank & Hall, 2011) with respect to the platform as the class variable (table 3).
It is noted that most of the information provided by the variables is related to the platform. The title and interactivity variables do not display any variance; hence they can be attributed neither to the course nor to the platform. Figure 4 shows the values obtained through the algorithm once standardized. If negative values are taken as typical course variables, the following should be considered: Field, Introductory video, Introduction, Target participants, Modules, Objectives, Teaching team, Importance to the public, Prerequisites and Length/weeks. If the cut-off is raised to one standard deviation (SD=-1), the typical course variables would be Weeks and Prerequisites.
Figure 4: Relationship between course and platform.
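The information gain measure defined in footnote 3 can be computed directly. A minimal sketch in plain Python, using toy attribute values rather than the study's variables:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(attr_values, labels):
    """H(Class) - H(Class | Attribute), as in footnote 3."""
    n = len(labels)
    cond = 0.0
    for v in set(attr_values):
        subset = [l for a, l in zip(attr_values, labels) if a == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond
```

An attribute that perfectly predicts the class yields the full class entropy as gain, while an attribute independent of the class yields a gain of zero; the standardized values in figure 4 are this measure rescaled across attributes.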
An overview of the literature addressing MOOCs emphasizes the relevance of pedagogy. The instrument (INdiMOOC-EdI), which enables an analysis of these components, was designed and applied to a total of 117 Spanish-language courses.
The study of the data obtained with this instrument, regarding descriptive, educational and interactive features, determines that it can be employed as a benchmark of indicators to attain a desirable pedagogical design in a MOOC. Some of the findings affirm, along with Glance & al. (2013), that MOOCs have a sound pedagogical basis and there is no reason to assume that they are less effective than other learning experiences.
Initially the variable course is analyzed; however, the data refer to the support platforms. The information obtained underscores a series of components that are not persistent in the vast majority of the courses analyzed. For instance, platforms do not regard it essential to specify: target participants, prerequisites and a clear and concise introduction about the course content or other related courses. Nonetheless, it does seem necessary to display: an introductory video; objectives, teaching team, length and weekly dedication, operation of the system, means and resources, activities and assessment. It has been shown that assessment is one of the most evident pedagogical benefits of MOOCs (Glance & al., 2013).
The existing profile of Spanish-language MOOCs, drawn from the pedagogical features in more than 70% of the courses examined, includes key features such as displaying the course title in a visible place (98.3%); an introductory video (83.8%); specifically addressing operation of the system (77.8%); an open structure, modules or lessons (76.9%) with an average of 8 modules per MOOC; course length limited to weeks (74.4%); platform provider linked to Higher Education (72.6%), displaying the teaching team in a visible spot (71.8%) and specifying the number of hours of weekly dedication (70.9%).
Although the platform offers the necessary technological support, MOOC proposals should clearly have their own autonomy. Contrary to the study by Roig & al. (2014), which found no significant variance in the pedagogical quality of MOOCs across platforms, it is ascertained here that platforms determine the pedagogical design of the courses. The same pattern comprising, inter alia, activities and materials, learning modalities, assessment proposals, level of interactivity, access and certification is repeated over and over again (as many times as the number of platforms examined). The characteristics of the activities students carry out, along with tutor counselling and didactic interventions, are key elements in predicting disengagement and drop-out rates (Halawa & al., 2014).
The data imply that platforms condition the pedagogical designs of MOOCs (figure 3), but this does not necessarily imply the existence of a pedagogical model underlying the MOOC proposal. That is, the platform constrains and restricts online courses, although some platforms deploy a degree of flexibility, with fluid boundaries among the different features in INdiMOOC-EdI.
If MOOCs are regarded as a dynamic and global phenomenon, as an educational response to the emergence and development of movements and online social networks, as a cybernetic alternative to learning without frontiers, as a useful self-directed learning experience, as an extension of the classroom, as a space for free movement of knowledge, as an opportunity for democratization and universal access to specialized content, as a training proposal with pedagogic autonomy..., then let us take advantage of these mentoring platforms whilst MOOCs have not fully matured.
Finally, an exploratory study such as this one provides an outline of the situation, but it faces certain constraints that should be addressed in future research, such as deeper insight into the field or methodological complementarity. It would be worthwhile to thoroughly examine a specific course or courses in specific fields of knowledge; the situation in other languages; or whether low completion rates can be attributed to pedagogical design. As Bartolomé (2013) states, we still lack a pedagogical framework validating that a MOOC teaches and that a MOOC generates knowledge. Further research is needed for ongoing progress and consolidation. There is still a need to refine concepts, models and experiences, overcome certain difficulties and minimize others; some MOOCs and platforms will lag behind, but many others will continue to be designed, developed and improved for millions of people around the world.
1 The study was carried out during the last academic year in response to an institutional innovation project commissioned to the research group which includes the authors of this paper.
2 Despite the evident educational value objectives possess, they are included in the descriptive features in order to simply determine whether they are present or not, since according to Roig & al. (2014: 37): ‘The existence of explicit learning objectives is associated with a high score in the pedagogical quality of MOOCs’.
3 Information Gain = H(Class) - H(Class | Attribute), i.e., the class entropy minus the entropy of the class conditioned on the attribute.
Aguaded, I. (2013). La revolución MOOCs, ¿una nueva educación desde el paradigma tecnológico? Comunicar, 41, 7-8. (DOI: http://doi.org/tnh).
Anderson, T. & McGreal, R. (2012). Disruptive Pedagogies and Technologies in Universities. Education, Technology and Society, 15, 4, 380-389. (http://goo.gl/H1mTkh) (25-11-2013).
Baggaley, J. (2014). MOOC Postscript. Distance Education, 35, 1, 126-132. (DOI: http://doi.org/tnj).
Bartolomé, A. (2013). Qué se puede esperar de los MOOC. Comunicación y Pedagogía, 269-270, 49-55.
Baxter, J.A. & Haycock, J. (2014). Roles and Student Identities in Online Large Course Forums: Implications for Practice. International Review of Research in Open and Distance Learning, 15, 1, 20-40. (http://goo.gl/RxOzmt) (13-04-2014).
Conole, G. (2013). MOOCs as Disruptive Technologies: Strategies for Enhancing the Learner Experience and Quality of MOOCs. RED, 39, 1-18. (http://goo.gl/6Q8GLP) (24-03-2014).
Cormier, D. & Siemens, G. (2010). Through the Open Door: Open Courses as Research, Learning & Engagement. Educause Review, 45, 4, 30-39. (http://goo.gl/AwTZhZ) (11-03-2014).
Cortina, J.M. (1993). What is Coefficient Alpha? An Examination of Theory and Applications. Journal of Applied Psychology, 78, 1, 98-104. (http://goo.gl/PL0fxu) (12-12-2013).
Creswell, J.W., Plano, V.L., Gutmann, M.L. & Hanson, W.E. (2008). Advanced Mixed Methods Research Designs. In V.L. Plano & J.W. Creswell (Eds.), The Mixed Methods Reader (pp. 161-196). Thousand Oaks, CA (USA): Sage.
Fini, A. (2009). The Technological Dimension of a Massive Open Online Course: The Case of the CCK08 Course Tools. The International Review of Research in Open and Distance Learning, 10, 5, 1-26. (http://goo.gl/YlU659) (08-09-2013).
Frank, E. & Witten, I.H. (1998). Generating Accurate Rule Sets without Global Optimization. Paper presented at the 15th International Conference on Machine Learning, Madison, Wisconsin. (http://goo.gl/FRQkET) (08-09-2013).
García, A. (2011). Técnicas actuales de estadística aplicada. Madrid: UNED.
Glance, D.G., Forsey, M. & Riley, M. (2013). The Pedagogical Foundations of Massive Open Online Courses. First Monday, 18, 5, 1-10. (DOI: http://doi.org/tkp).
Haggard, S. (2013). Massive Open Online Courses and Online Distance Learning: review. GOV.UK Research and Analysis. UK: Universities UK. (http://goo.gl/W3T6mO) (27-02-2014).
Halawa, S., Greene, D. & Mitchell, J. (2014). Dropout Prediction in MOOCs using Learner Activity Features. ELearning Papers, 37, 3-12. (http://goo.gl/l1vdWl) (19-03-2014).
Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P. & Witten, I. H. (2009). The WEKA Data Mining Software: An Update. SIGKDD Explorations, 11, 1, 10-18. (http://goo.gl/0k0a90) (23-11-2013).
Hernández, R., Fernández, C. & Baptista, P. (2010). Metodología de la investigación. Madrid: Pearson.
Huh, J., Delorme, D.E. & Reid, L.N. (2006). Perceived Third-Person Effects and Consumer Attitudes on Prevetting and Banning DTC Advertising. Journal of Consumer Affairs, 40, 1, 90-116. (DOI: http://doi.org/dpj596).
Jordan, K. (2014). Initial Trends in Enrolment and Completion of Massive Open Online Courses. The International Review of Research in Open and Distance Learning, 15, 1, 133-160. (http://goo.gl/PHWxaJ) (17-04-2014).
Koutropoulos, A., Gallagher, M.S., Abajian, S.C., deWaard, I., Hogue, R.J., Keskin, N.Ö. & Rodriguez, C.O. (2012). Emotive Vocabulary in MOOCs: Context & Participant Retention. European Journal of Open, Distance and E-Learning. 1, 1-23. (http://goo.gl/xO6dHU) (21-11-2013).
Lawshe, C.H. (1975). A Quantitative Approach to Content Validity. Personnel Psychology, 28, 563-575. (http://goo.gl/ql6Gyn) (26-06-2014).
Liyanagunawardena, T., Adams, A. & Williams, A. (2013). MOOCs: A Systematic Study of the Published Literature 2008-12. The International Review of Research in Open and Distance Learning, 14, 3, 202-227. (http://goo.gl/6vLnt8) (20-03-2014).
Martínez, F., Rodríguez, M.J. & García, F. (2014). Evaluación del impacto del término «MOOC» vs «Elearning» en la literatura científica y de divulgación. Revista de Currículum y Formación del Profesorado, 18, 1, 186-201. (http://goo.gl/HZPhKX) (25-06-2014).
McMillan, J. & Schumacher, S. (2005). Investigación educativa. Madrid: Pearson.
Medina-Salguero, R. & Aguaded, I. (2014). Los MOOC en la plataforma educativa MiriadaX. Revista de Currículum y Formación del Profesorado, 18, 1, 137-153. (http://goo.gl/QCTZqL) (23-06-2014).
Molina, O. & Espinosa, E. (2010). Rotación en análisis de componentes principales categórico: un caso práctico. Metodología de encuestas, 12, 63-88.
Nunnally, J.C. (1967). Psychometric Theory. New York: McGraw-Hill.
Pernías, P. & Luján, S. (2013). Los MOOC: Orígenes, historia y tipos. Comunicación y Pedagogía, 269-270, 41-47.
Quinlan, J.R. (1993). C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann.
Rodríguez, C.O. (2012). MOOCs and the AI-Stanford Like Courses: Two Successful and Distinct Course Formats for Massive Open Online Courses. European Journal of Open, Distance and E-Learning, 2, 1-13. (http://goo.gl/JG2aix) (19-09-2013).
Roig, R., Mengual-Andrés, S. & Suárez, C. (2014). Evaluación de la calidad pedagógica de los MOOC. Profesorado. Revista de Currículum y Formación del Profesorado, 18, 1, 27-41. (http://goo.gl/hE7TSe) (23-06-2014).
Siemens, G. (2005). Connectivism: A Learning Theory for a Digital Age. International Journal of Instructional Technology and Distance Learning, 2, 1, 3-6. (http://goo.gl/MAzRa8) (11-09-2013).
Vázquez-Cano, E. (2013). El videoartículo: nuevo formato de divulgación en revistas científicas y su integración en MOOCs. Comunicar, 41, 83-91. (DOI: http://doi.org/tnk).
Vázquez-Cano, E., López, E. & Sarasola, J.L. (2013). La expansión del conocimiento en abierto: los MOOC. Barcelona: Octaedro.
Witten, I.H., Frank, E. & Hall, M. (2011). Data Mining: Practical Machine Learning Tools and Techniques. Burlington: Elsevier.
Yuan, L. & Powell, S. (2013). MOOCs and Open Education: Implications for Higher Education. UK: Cetis.
Zapata, M. (2013). MOOCs, una visión crítica y una alternativa complementaria: La individualización del aprendizaje y de la ayuda pedagógica. Campus Virtuales, 1 (II), 20-38. (http://goo.gl/2r98ZQ) (11-03-2014).