DEVELOPMENT OF A ROBOT ARM: A REVIEW

  • January 2018
  • Conference: 8th National Engineering Conference 2018
  • At: FEDERAL POLYTECHNIC BIDA, NIGER STATE. NIGERIA

Ibrahim Suleiman, The Federal Polytechnic Bida

Abstract

Forward kinematics: geometry of the two-link planar robot. Figure 2 shows the forward-kinematic geometry of a typical two-link planar robot. Goto (2011) used the vector-algebra solution to analyse the geometry of the two-link planar robot and derived Equations 1, 2, and 3. The Denavit-Hartenberg (DH) convention is also used to determine the desired coordinate position (Chen and Chen, 2018).

x = l1 cos(θ1) + l2 cos(θ1 + θ2)   (1)
y = l1 sin(θ1) + l2 sin(θ1 + θ2)   (2)
θ = θ1 + θ2   (3)

Inverse kinematics, on the other hand, deals with the problem of finding the appropriate joint angles to reach a desired position and orientation of the end-effector (Siciliano et al., 2009).
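The forward and inverse kinematics described above can be sketched directly in code. The following is a minimal illustration (the link lengths and angles are arbitrary examples, not values from the paper), assuming the standard law-of-cosines closed-form solution for the two-link inverse problem:

```python
import math

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector pose of a two-link planar arm (Equations 1-3)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y, theta1 + theta2

def inverse_kinematics(l1, l2, x, y):
    """One closed-form solution for the joint angles that reach (x, y),
    via the law of cosines; raises ValueError if the point is unreachable."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Round trip: recover the commanded joint angles from the end-effector position.
t1, t2 = math.radians(30), math.radians(60)
x, y, _ = forward_kinematics(1.0, 1.0, t1, t2)
r1, r2 = inverse_kinematics(1.0, 1.0, x, y)
```

Note that `math.acos` yields only one of the two mirror-image (elbow-up/elbow-down) solutions; the other is obtained by negating θ2 and recomputing θ1 accordingly.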

Discover the world's research

  • 25+ million members
  • 160+ million publication pages
  • 2.3+ billion citations

Mohammed Azher Therib

  • Mahdi Alkhafaji

Isak Karabegović

  • Zhi-Ying Chen

Chin-Tai Chen

  • Albano Lanzutti

Renato Vidoni

  • G. Ayorkor Mills-Tettey

Thrishantha Nanayakkara

  • Kurt E. Clothier

Ying Shang

  • L. Sciavicco

Villani Luigi

  • Giuseppe Oriolo

John Craig

  • Recruit researchers
  • Join for free
  • Login Email Tip: Most researchers use their institutional email address as their ResearchGate login Password Forgot password? Keep me logged in Log in or Continue with Google Welcome back! Please log in. Email · Hint Tip: Most researchers use their institutional email address as their ResearchGate login Password Forgot password? Keep me logged in Log in or Continue with Google No account? Sign up


Evolution of robotic arms

Michael E. Moran

Medical and Surgical Specialists, 6101 Pine Ridge Road, Naples, FL 34119 USA

The foundation of surgical robotics is in the development of the robotic arm. This is a thorough review of the literature on the nature and development of this device, with emphasis on surgical applications. We have reviewed the published literature and classified robotic arms by their application: show, industrial application, medical application, etc. There is a definite trend in the manufacture of robotic arms toward more dexterous devices, more degrees-of-freedom, and capabilities beyond the human arm. da Vinci designed the first sophisticated robotic arm in 1495, with four degrees-of-freedom and an analog on-board controller supplying power and programmability. von Kempelen’s chess-playing automaton had a quite sophisticated left arm. Unimate introduced the first industrial robotic arm in 1961; it subsequently evolved into the PUMA arm. In 1963 the Rancho arm was designed; Minsky’s Tentacle arm appeared in 1968, Scheinman’s Stanford arm in 1969, and MIT’s Silver arm in 1974. Aird became the first cyborg human with a robotic arm in 1993. In 2000 Miguel Nicolelis redefined possible man–machine capacity in his work on cerebral implantation in owl monkeys directly interfacing with robotic arms both locally and at a distance. The robotic arm is the end-effector of robotic systems and is currently the hallmark feature of the da Vinci Surgical System making its entrance into surgical application. But, despite the potential advantages of this computer-controlled master–slave system, robotic arms have definite limitations. Ongoing work in robotics has many potential solutions to the drawbacks of current robotic surgical systems.

Introduction

“If you will have the precision out of them, and make their fingers measure degrees like cog-wheels, and their arms strike curves like compasses, you must inhumanize them.” (J. Ruskin, The Stones of Venice [ 1 ])

Although surgical robotics is in its infancy, the rapid proliferation of surgical systems attests to the fact that this technology is here to stay and that we urologists should brace ourselves for the next wave of technology that will yet again change the way we work [ 2 ]. Many in practice are rather startled by the rapid insurgence of this sophisticated technology into the armamentarium of clinical practice. Many are overawed by the sophistication of the equipment that underlies the computer-enhanced technology that lurks “under the hood” of the da Vinci Surgical System (Intuitive Surgical, Sunnyvale, CA, USA). Yet one finds such suppositions are unfounded if one simply looks back on the steady progress leading to our current situation.

This is a historical overview of the prime robotic surgical end-effector, the robotic arm. It is hoped that such an overview will better prepare the urologist to appreciate the pedigree of the sophisticated apparatus we are currently using and, potentially, anticipate the modifications and evolution this technology has in store for every aspect of urologic surgical practice. History is fascinating in that insights and trends can be used to emphasize ongoing basic research efforts and develop an enlightened opinion of the overall meaning of this technology to us as urologists.

The approach in this historical review will be a bit different from that in other published accounts of robotic technology that is increasingly proliferating [ 3 ]. The robotic arm will be the sole topic of this investigation and will be dissected rather like the human arm. Some context will be added for literary interest but the focus will be on a sequential timeline of development and how we arrived at a piano-wire based, seven degrees-of-freedom surgical system for urology that is now sweeping across the United States. The attempt is to thoroughly paint a scenario of human aspiration to achieve an augmented, human-like effector that would provide all of the advantages of mechanization and eliminate all of the potential disadvantages of the human actuator. Historical attempts before modern electrical systems will be investigated first. The joints of mechanical systems anthropomorphically reflect the human arm. The shoulder joint of modern mechanical arms will be addressed next. The elbow joint followed by the wrist will then be evaluated. Finally, the hand will be explored in all of the iterations to the present, which in some ways is the bridge from the past to present day surgical systems.

Where will all of this technology end you might ask? This technology, although in its infancy, has a historical legacy that is almost as intriguing as the software and hardware that now underlies these technological wonders. At the conclusion of this article, “cutting edge” basic research that is merging digital technology and robotics with neuroscience and cognitive research in what is often referred to as brain–machine interface systems will be presented. These fusion areas were the ultimate goals of those who began, so long ago, to dream of mechanical systems that would aid and relieve the ardors of labor and augment human performance. Nowhere in medicine is this more necessary than in surgery, where a deftly executed, minimally invasive procedure can alleviate so much pain and suffering [ 4 ]. When all is said and done, a well crafted tale can infuse a better understanding of the potential of these enabling technologies than a scientific review of the same. As the saying goes, “Chronology is the last refuge of the feeble minded and the only resort for historians.” [ 5 ].

da Vinci’s robotic arm

One can think of no finer place to start a historical dissertation on robotics than the master of Renaissance method, Leonardo da Vinci. In the early 1950s, investigators at the University of California scrutinized detailed drawings from da Vinci’s notebooks, which together form a tome exceeding 1,119 pages dating from 1480 to 1518 and therefore referred to, like the great Atlantic Ocean, as the Codex Atlanticus. da Vinci was profoundly influenced by classical Greek thinkers in art and in engineering. Modern investigations increasingly make it clear that he singularly pursued knowledge of everything known to these ancient scholars. He was, in effect, following in the footsteps of such figures as Hero of Alexandria, Philon, and Ctesibius, who were all reported to be interested in mechanically simulating motion and human attributes. He was possibly inspired by Homer’s Iliad: “...since he was working on twenty tripods which were to stand against the wall of his strong-founded dwelling. And he had set golden wheels underneath the base of each one so that of their own motion they could wheel into the immortal gathering, and return to his house: a wonder to look at.” (Homer, Iliad, book 18). da Vinci began a systematic method of devising and building a sophisticated mechanical device that was 500 years ahead of its time. His first robotic design was in December 1478, at the age of 26, before he moved to Milan (Fig.  1 ). In the Codex Atlanticus, folio 812, is a power mechanism that features a front-wheel-drive, rack-and-pinion automobile. Impressive as it is, it was also fully programmable, with the ability to control its own motion and direction. It is now thought that this “base” would form the basis of his ultimate goal, a fully functional automaton [ 6 ].


Leonardo da Vinci’s front wheel drive, rack-and-pinion steering animated cart, photos courtesy of Biblioteca Ambrosiana in Milan

To animate a humanoid machine he was cognizant of his need to develop a more detailed database of human kinesiology. Leonardo grounded his knowledge further with drafting, anatomy, metal working, tool making, and armor design, in addition to painting and sculpture. Leonardo was not content with a simple understanding of human anatomy, so he began to investigate and draw comparative anatomy, to better appreciate form and function. “You should make a discourse concerning the hands of each of the animals, in order to show in what way they vary.” [ 7 ].

In 1495, at about the time he was working on his method of painting on wet plaster and the Last Supper, da Vinci designed and probably built the first of several programmable humanoid robots. From research ongoing at the Florence-based Institute and Museum of the History of Science and work by Rosheim, it is now apparent his robot could open and close its anatomically correct jaw, sit up, wave its arms, and move its head [ 8 ]. This robot consisted of two independent systems (Fig.  1 ). The lower extremities had three degrees-of-freedom, at the ankles, knees, and hips. The upper extremities had four degrees-of-freedom: arms with articulated shoulders, elbows, wrists, and hands. The orientation of the arms indicates it could only whole-arm grasp, with the joints moving in unison. The device had an “onboard” programmable controller within the chest providing power and control over the arms. The legs were powered by an external crank arrangement. The Florence-based Institute and Museum of the History of Science has developed sophisticated computer models of this design with streaming video animations. Leonardo probably returned to this design again to impress his erstwhile potential royal patron, Francis I of France. Writing about Leonardo in 1584, Lomazzo records that Francesco Melzi (one of his pupils, and an heir) stated that Leonardo made several automatons: “birds, of certain material that flew through the air and a lion that could walk...the lion, constructed with marvelous artifice, to walk from its place in a room and then stop, opening its breast which was full of lilies and different flowers.” Rosheim believes that the leaf spring-powered cart could have powered the mechanical lion and his automaton knight. Leonardo’s multi-degrees-of-freedom automaton is an appropriate starting point for man’s technical interest in recapitulating form and function. da Vinci’s intense attention to detail will be a recurrent theme throughout this historical sojourn.
In Leonardo’s own words, “With what words, O Writer, will you describe with like perfection the entire configuration which the drawing here does?” (da Vinci, 1513).

From automata to the Industrial Revolution

It has been suggested that the son of a glove-maker might well have been the spark that ignited the Industrial Revolution [ 9 ]. Jacques de Vaucanson was a gifted mechanical designer and builder of some of the most complex clockwork automata of the eighteenth century. He was born in Grenoble in 1709, the youngest of ten children, and began to show signs of his mechanical genius at a young age. Vaucanson too showed marked interest in the functioning of the human body, and is known to have attended classes in anatomy and medicine at the Jardin du Roi; he probably came into contact with Claude-Nicolas Le Cat (the famed lithotomist). By 1738, the young entrepreneur had designed and built an automaton flute player, which was called an “androide”. By 1739 he had added two other automata to his exhibition, a pipe-and-drum player and a mechanical duck. The most popular and famous, by far, of all of his mechanical contrivances was the duck [ 10 ]. Our interests here are mechanical arms, so attention to the duck and drummer will fade and we shall remain focused upon the flutist (Fig.  2 ). The price for admission to Vaucanson’s rented hall was significant, approximately three livres (one week’s salary in those days). The Abbe Desfontaines, who was agape at the human-like characteristics of the flutist, described the insides as containing an “infinity of wires and steel chains...form the movement of the fingers, in the same way as in living man, by the dilation and contraction of muscles.” Vaucanson gave a detailed account of his android to the Academy of Sciences and, in fact, published an illustrated account [ 11 ].


de Vaucanson’s flute player, details of the finger mechanism are included

Others followed in Vaucanson’s wake. Most significant were the Swiss clock-making family named Jaquet-Droz. In 1774, the father, Pierre, with his son Henri-Louis, began to execute three life-sized automata with particular emphasis on their human-like capabilities. It is likely that the village surgeon helped with the development of the arms and hands of these androids. These craftsmen made every attempt to simulate a real human’s anatomy. They created an artist, a writer, and a musician. The musician played a clavichord by applying pressure to the keys with her fingertips (Fig.  3 ).


Jaquet-Droz’s 1774 lady musician, reproduced with permission [ 12 ]

The final automaton of interest in this series is Wolfgang von Kempelen’s chess player, often called the Turk [ 13 ]. It was constructed in 1769 for the Empress Maria Theresa. The Turk was an elaborate hoax, with a human operator concealed inside the complex cabinetry underneath the chessboard. The automaton, though, had an ingenious system of mechanisms that animated the chess player’s left arm and hand. The chess player was a carved-wood figure, dressed in Turkish garb, that sat behind a wooden chest. The head moved on his neck and the eyes moved in their sockets, but the left arm and hand were magnificently orchestrated. The Turk engendered a wide variety of writings about the possibility of animating human reason and human activities. The mechanics of the arm were controlled by the “director”, the name given by those who knew that the games were human-controlled. Kempelen had designed a pantograph, a device that enabled the director to steer the automaton’s left arm from inside the chest (Fig.  4 ). The limb would first be raised, then the hand would center over the desired chess piece. The arm would lower towards the piece and a collar would be turned to allow the end of a lever in the director’s hand to make the Turk’s fingers grasp the chess piece. The automaton’s fingers were wooden, and during a match the hand was placed inside a glove so it could grasp the chess pieces with more agility. Each finger had its own series of cables connected to the director’s pantograph.


von Kempelen’s Turk, chess player with illustrated left arm mechanisms

Robots of the World’s Fair

It is possible that in the recent history of the world only wars have had a more dramatic impact upon our society than expositions. The first industrial exposition occurred in Paris in 1798 and enabled the public to witness progress and technology that could change the lives of everyone. This process continued into the nineteenth century, when the extraordinary potential of remote-controlled robotic devices was clearly demonstrated to an unsuspecting public at the 1898 Electrical Exhibition in Madison Square Garden, New York City. Nikola Tesla was at the height of his inventive prowess when he brought upon the unprepared world a fully automated, remote-controlled robotic submersible boat (Fig.  5 ). “Teleautomata will ultimately be produced, capable of acting as if possessed of their own intelligence, and their advent will create a revolution.” (Tesla, 1898 [ 14 ]).


Nikola Tesla’s 1898 remote-controlled robotic vessel

It would be 37 years and one World War later, at the San Diego Exposition, that the next robotic device would greet the public. A little-known and not widely regarded 2,000-lb mechanical man was demonstrated by its inventor, Professor Harry May. Alpha, as the robot was named, was 6′2″ tall and could roll its eyes, open and close its mouth, sit and stand, move its arms, and fire a revolver (Fig.  6 ). In 1939, a super-secret and far more popular mechanical man was introduced at the New York World’s Fair by the electronics giant Westinghouse. Elektro was a spectacular hit at the Westinghouse Pavilion. Elektro would stand high above the audience on a platform and, supposedly, respond to spoken English commands (Fig.  6 ). Elektro was able to perform far more complex tasks than Alpha; he could move about the stage with a strange sliding gait. Elektro was approximately 7 feet in height and cost the Westinghouse Corporation several hundred thousand dollars to make in Mansfield, Ohio [ 15 ]. Records of the company show they manufactured eight robots from 1931 to 1940. These robots could all move actuated arms and walk. Elektro used a 78-rpm record player to simulate conversation and had a vocabulary of more than 700 words. Elektro was captivating; he enthralled millions of visitors, went on tour after the World’s Fair, and even appeared in a bad “B” movie, “Sex Kittens Go to College”, subtitled Beauty and the Robot. Most curious of all, these mechanical men were not called robots, because Karel Čapek’s play “Rossum’s Universal Robots” had not yet achieved the notoriety and cultural currency that would later attach to the word.


The World’s Fair robots: far left , Harry May’s Alpha; right , three Westinghouse Elektro robots

The electronics in these early metal men were primitive, with loud electrical motor drivers and vacuum-tube relays. They would be replaced with microcircuits and far more rapid, efficient, and quiet mechanics in the not too distant future. The World’s Fair phenomenon, robots included, continues to this day. The World Expo 2005 was held in Aichi, Japan, and closed in September with over 22,000,000 in attendance. The theme was “Nature’s Wisdom,” but the technology was definitely center stage. The robot assumed a key role under the slogan “we live in a robot age”. Working robots roved around the grounds and performed routine chores including sanitation, garbage collection, security, guiding visitors, child care, and aid to the handicapped (Fig.  7 ). Multiple prototype robots were demonstrated for 11 days in June. The exhibition also had a “Robot Station” where visitors were able to interact with a whole host of robot-based venues. As is true of most such industrial expositions, manufacturers, including Toyota, Mitsubishi, and Brother Industries, were present to show off their future technology [ 16 ].


Expo 2005 Aichi, Japan with increasingly sophisticated robots, and robot actuators (look at the arms)

Early modern robots and robotic arms

Now, with the advent of electronics and the incorporation of solid-state transistors instead of vacuum tubes, the evolution of the microcircuit, and more rapid computer systems, the stage was set for early modern robotic arm evolution. The first “position controlling apparatus” was patented in 1938 by Willard Pollard (Fig.  8 ). This was a spray-finishing robotic arm that had five degrees-of-freedom and an electrical control system. Although Pollard [ 17 ] never built his arm, his design and interest in an industrial application for automated robotic arms would spur on the ingenuity of others. Harold A. Roselund [ 18 ], working for De Vilbiss, developed another sprayer that was indeed manufactured. Both arms were very sophisticated for their time, and each solved movement at the respective joints in unique ways; the electronic controller systems lacked the fidelity required to make them broadly usable, however. The modern era of robotics was launched by the intrepid use of these two little-known arms developed in the late 1930s.


Early modern robotic arms: left , the Pollard painting arm; right , Unimate

Unimate introduced its first robotic arm in 1962 (Fig.  8 ) [ 19 ]. The arm was invented by George Devol and marketed by Joseph Engelberger. The first industrial arm was installed at the General Motors Ternstedt plant in New Jersey for automated die-casting. Ultimately, approximately 8,500 units were sold. Industrial robots graduated from the laboratory to the factory [ 20 ]. It is interesting that in this process the robotic arm’s movements and degrees-of-freedom took on nautical terms: pitch, yaw, and roll.

Engelberger developed the first robotics company, called Unimation (from Devol’s “Universal Automation”), to sell their two-ton robotic arm, the Unimate. Unimation eventually sold 8,500 Unimates. Kawasaki bought the license to manufacture industrial robot arms from Unimation in 1966. Competition came quickly: the Cincinnati-based Milacron appeared, and by 1963 AMF Hermatool had brought out their commercially available Versatran industrial robot, which Japan imported in 1967. A whole host of academic centers became interested in the applications of microelectronics and the potential of these robotic arms (Fig.  9 ). A Stanford Research Institute investigator, Victor Scheinman, began working on electrically powered articulated arms that could move through six axes, which he called the Stanford arm. More complex tasks could now be given to the robotic arms. Marvin Minsky, then at MIT, built a robotic arm for the Office of Naval Research for possible underwater exploration. Twelve single-degree-of-freedom joints were used to actuate this electro-hydraulic high-dexterity arm. Scheinman continued his work on robotic arms and, with backing from General Motors, Unimation developed Scheinman’s technology into the Programmable Universal Machine for Assembly (PUMA).


From left to right , Rancho Arm, 1963; Minsky’s Tentacle Arm, 1968; the Stanford Arm, 1969; Silver Arm, 1974

The robotic arm

So we come to the robotic arm itself and applications to the medical field in particular. The most obvious method in this evolution was adaptation along the lines of human anatomy and kinesiology (Fig.  10 ).


Degrees-of-freedom in the robotic arm and the musculoskeletal system of the human equivalent

Shoulder joint

The shoulder joint is the highest load-bearing joint in the arm. The three degrees-of-freedom at the shoulder are pitch, yaw, and roll. The shoulder has the widest range of motion of any joint in the human body and is the foundation for most modern robotic arms. The horizontal flexion and extension (yaw) of the human shoulder is 160°. The forward flexion and hyperextension of the shoulder (pitch) is 240°. Finally, the medial and lateral rotation (roll) is 160°. In the normal human, the pitch and yaw are perpendicular to the arm, whereas the roll is in-line with the arm.
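These three rotations do not commute: applying pitch before yaw yields a different orientation from yaw before pitch, which is one reason a robotic shoulder is modeled as an ordered sequence of revolute joints rather than a single hinge. A small pure-Python sketch (the axis assignments are illustrative, not a specific robot's convention) makes the point:

```python
import math

def rot_x(a):
    """Rotation about the x axis (e.g. shoulder pitch) by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    """Rotation about the y axis (e.g. shoulder yaw) by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Pitch-then-yaw versus yaw-then-pitch give different orientations.
A = matmul(rot_x(math.pi / 2), rot_y(math.pi / 2))
B = matmul(rot_y(math.pi / 2), rot_x(math.pi / 2))
differs = any(abs(A[i][j] - B[i][j]) > 0.5 for i in range(3) for j in range(3))
```

Because composition order matters, the joint sequence chosen for a shoulder mechanism determines which orientations are easy to reach and where singular configurations lie.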

Elbow joint

The elbow joint provides extension, retraction, reach-around, and angular reorientation of the wrist and hand. Classically, the elbow provides 150° of pitch. Many types of mechanical elbow joint have been used in robotic arm manufacture, including telescoping, revolute (subdivided by drive-train), intermediate, remote, and direct. Of these mechanical types, the revolute is most similar to the human arm. The telescoping joint was an early type; it deviates substantially from the human anatomic concept, and its applications have been limited.

Wrist joint

The wrist mechanisms developed for robotic arms were crucial even in the earliest prototypes (Fig.  9 ). The wrist is the end-effector terminus of the robotic arm, and it allows the arm to be manipulated in three-dimensional space. Without a wrist, the mechanical arm would function more like Leonardo’s robot or most modern crane arms. This joint is becoming increasingly complex in modern robots and is one of the fundamental features of the da Vinci Surgical System. The robotic wrist is the sine qua non of high-performance robotic arms. If the human wrist moves 45° off center, the ability to roll degenerates, resulting in gimbal locking. The earliest robotic applications of wrists were in the very first painting and welding robotic arms. Much more sophisticated wrists enable more dexterous teleoperated systems, but singularities remain a problem, and almost everyone who has used the da Vinci Surgical System has probably experienced gimbal locking of the wrist.
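Gimbal locking can be reproduced numerically. In the hypothetical sketch below, a wrist orientation is built from a Z-Y-X sequence of rotations; when the middle (pitch) rotation reaches 90°, the first and third axes align, so distinct angle pairs collapse onto the identical orientation and one degree of freedom is effectively lost. This illustrates the general phenomenon, not the actual kinematic chain of the da Vinci wrist:

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def zyx(yaw, pitch, roll):
    """Orientation from a Z (yaw), then Y (pitch), then X (roll) sequence."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))

# With pitch locked at 90 degrees, only the difference (roll - yaw) matters:
# (10, 40) and (30, 60) share the same difference and give the same matrix.
R1 = zyx(math.radians(10), math.pi / 2, math.radians(40))
R2 = zyx(math.radians(30), math.pi / 2, math.radians(60))
same = all(abs(R1[i][j] - R2[i][j]) < 1e-12 for i in range(3) for j in range(3))
```

In this singular configuration the yaw and roll axes are parallel, which is exactly the degenerate behavior an operator experiences as gimbal lock at the extremes of wrist articulation.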

The hand is a “differentiated” end-effector of the robotic arm that defines the purpose and capacity of the arm. The hand is a multi-tasking tool capable of diverse functions, for example grasping, manipulating, and pushing. A robotic hand raises multiple control issues, in both motor control and sensory perception. Many universities are currently investigating this topic, more so than industry.

The five principal types of robotic arm are: rectangular coordinate, polar coordinate, cylindrical coordinate, revolute coordinate, and the selective compliance assembly robot arm (SCARA). Two more recent additions are called serpentine and anthropomorphic [ 21 ]. These arms can be subdivided by the types and complexity of each of their joints and control systems. The evolution of robotic arms is rapid, however, and such schemes probably do more to organize information than to define the actual product. Applications to medicine, and surgery in particular, are ripe for companies, because classic fields of application, for example nuclear reactor work, have declined. In the past 40 years radical improvements have been made and more degrees-of-freedom are now possible. Downsizing and cost reduction will follow. Hand technologies will rapidly advance as computer-control issues improve, and work at universities will find fruitful applications in industry and medicine. “Haptics” and other sensory systems will be added to advanced surgical robotics as this technology evolves.

It is necessary to discuss the two additional categories of robotic arm a bit further, because they may become more important to medical applications. The first is the serpentine robot. These devices were originally designed on the basis of the kinesiology of another complex biological structure, the spine (Fig.  11 ). The purpose of making serpentine robots was to produce a device with more degrees-of-freedom than the normal human arm. As computer-control algorithms have advanced and the means to control the complex maneuvers of 10, 20, or 30 degrees-of-freedom have become available, these systems have become increasingly capable. The first such systems were called “serpentine” because it was necessary for the robotic arms to “snake” through passages and pipes to inspect nuclear reactors, fuel tank baffles, and wing spars. To overcome the multiple-joint-control issues and prevent restrictive backlash, Miyake in 1986 described innovative solutions in control [ 22 ]. In 1968 the US Navy funded a spine-like arm for ocean exploration; this has been called the Scripps tensor arm. Another such ultrahigh-dexterity robotic arm, called the Articulating Mechanism, was developed by Ralph Mosher in 1969. It was a modular and low-cost alternative to the Scripps design, but was not as precise. Many of the space arms used on the United States Space Shuttle were serpentine. The arm designed by Frederick Wells in 1970 at the Marshall Space Flight Center in Huntsville, Alabama, was such a device. This arm has continued to evolve with improvements by Iwatsuka, in 1986, and by Wuenecher, who called his device the remote-control manipulator, intended to aid astronauts. The Spine Robot is a Swedish-made serpentine robotic arm invented by Ove Larson and Charles Davidson in 1983. It consists of stacked ovoidal discs controlled by opposable cables. There are now many versions of this design, which use bellows, U-joints, and pressurized capillary systems (Scheinman).
In 1984, Motohiko Kuura designed expandable and contractible arms for serpentine applications. The final addition in this series is the 1991 modular robotic joint (MJR) arm invented by Mark Rosheim [ 21 ]. The advantages of this system are that it has more degrees-of-freedom than the human arm, increases modularity, and is fault-tolerant: if one joint fails, another is capable of providing the mobility needed to accomplish the task. Why would surgeons be at all interested in more degrees-of-freedom, you might wonder? Another coming technological tour-de-force is woundless surgery. This type of complex surgery is also called peroral or transgastric endoscopic surgery, and cholecystectomies, appendectomies, and tubal bandings have already been performed [ 23 , 24 ]. To achieve more complex tasks, for example nephrectomy or radical prostatectomy, an ultrahigh-dexterity robotic arm will be necessary.

Fig. 11 Serpentine robotic arm with future potential for “woundless” surgery

Another aspect of ongoing work is funded by grants supporting the rehabilitation of handicapped individuals [ 25 ]. Neuro-enhanced prosthetics such as cochlear implants, retinal implants, and highly dexterous limb prosthetics are already available [ 26 ]. Direct neural control would produce hybrid devices with the reliability and control of robotic arms and a completely natural interface via the individual’s own neocortex. Patients with “locked-in syndrome”, for example after severely disabling cerebral vascular accidents or advanced amyotrophic lateral sclerosis (ALS), have already been implanted with new intracranial electrodes that can directly interact with computers [ 27 ]. Fusion technology challenges the way we think about our own humanity; perhaps our own neural plasticity will enable advanced control directly by use of our own thought (Fig. 12) [ 28 ].

Fig. 12 Amputee with neural-interactive robotic prosthesis. Center: Kennedy’s implantable neural array for patients with “locked-in syndrome.” Right: the concept from Miguel Nicolelis’ laboratory at Duke University showing learned cortical control in owl monkeys, controlling a robotic arm both directly and remotely from Durham, N.C., to MIT in Cambridge, Massachusetts [ 29 ]

Some believe invasive implantable arrays are the future mode of choice, enabling our neocortex to link directly to the computers and mechanical actuators that will provide precision control. Others, however, believe this can be accomplished without surgically implanted arrays, and that the EEG has the potential to serve as a brain–machine interface [ 30 ].

The robotic arm is finally becoming the tool envisaged by those workers whose legacy started this intellectual exercise. The robotic surgical system we currently use in Urology is the “tip of the iceberg” for robotic systems [ 31 ]. Although seemingly sprung on unsuspecting clinicians, these complex machines represent a long lineage of work beginning from early modern times and continuing to the present. Whether or not you believe this currently expensive technology will affect your practice, whether you believe you can do better yourself laparoscopically or via open surgical methods, and whether or not you believe the technology is moving faster than human social systems can handle it, there is no longer any doubt this is just the first of many potential incursions of the robotic arm into the surgical arena of the future. The robotic arm has an absolutely fascinating history of which this is just a brief glimpse.

  • NEWS FEATURE
  • 28 May 2024
  • Correction 31 May 2024

The AI revolution is coming to robots: how will it change them?

  • Elizabeth Gibney


Humanoid robots developed by the US company Figure use OpenAI programming for language and vision. Credit: AP Photo/Jae C. Hong/Alamy


For a generation of scientists raised watching Star Wars, there’s a disappointing lack of C-3PO-like droids wandering around our cities and homes. Where are the humanoid robots fuelled with common sense that can help around the house and workplace?

Rapid advances in artificial intelligence (AI) might be set to fill that hole. “I wouldn’t be surprised if we are the last generation for which those sci-fi scenes are not a reality,” says Alexander Khazatsky, a machine-learning and robotics researcher at Stanford University in California.

From OpenAI to Google DeepMind, almost every big technology firm with AI expertise is now working on bringing the versatile learning algorithms that power chatbots, known as foundation models, to robotics. The idea is to imbue robots with common-sense knowledge, letting them tackle a wide range of tasks. Many researchers think that robots could become really good, really fast. “We believe we are at the point of a step change in robotics,” says Gerard Andrews, a marketing manager focused on robotics at technology company Nvidia in Santa Clara, California, which in March launched a general-purpose AI model designed for humanoid robots.

At the same time, robots could help to improve AI. Many researchers hope that bringing an embodied experience to AI training could take them closer to the dream of ‘artificial general intelligence’ — AI that has human-like cognitive abilities across any task. “The last step to true intelligence has to be physical intelligence,” says Akshara Rai, an AI researcher at Meta in Menlo Park, California.

But although many researchers are excited about the latest injection of AI into robotics, they also caution that some of the more impressive demonstrations are just that — demonstrations, often by companies that are eager to generate buzz. It can be a long road from demonstration to deployment, says Rodney Brooks, a roboticist at the Massachusetts Institute of Technology in Cambridge, whose company iRobot invented the Roomba autonomous vacuum cleaner.

There are plenty of hurdles on this road, including scraping together enough of the right data for robots to learn from, dealing with temperamental hardware and tackling concerns about safety. Foundation models for robotics “should be explored”, says Harold Soh, a specialist in human–robot interactions at the National University of Singapore. But he is sceptical, he says, that this strategy will lead to the revolution in robotics that some researchers predict.

Firm foundations

The term robot covers a wide range of automated devices, from the robotic arms widely used in manufacturing, to self-driving cars and drones used in warfare and rescue missions. Most incorporate some sort of AI — to recognize objects, for example. But they are also programmed to carry out specific tasks, work in particular environments or rely on some level of human supervision, says Joyce Sidopoulos, co-founder of MassRobotics, an innovation hub for robotics companies in Boston, Massachusetts. Even Atlas — a robot made by Boston Dynamics, a robotics company in Waltham, Massachusetts, which famously showed off its parkour skills in 2018 — works by carefully mapping its environment and choosing the best actions to execute from a library of built-in templates.

For most AI researchers branching into robotics, the goal is to create something much more autonomous and adaptable across a wider range of circumstances. This might start with robot arms that can ‘pick and place’ any factory product, but evolve into humanoid robots that provide company and support for older people, for example. “There are so many applications,” says Sidopoulos.

The human form is complicated and not always optimized for specific physical tasks, but it has the huge benefit of being perfectly suited to the world that people have built. A human-shaped robot would be able to physically interact with the world in much the same way that a person does.

However, controlling any robot — let alone a human-shaped one — is incredibly hard. Apparently simple tasks, such as opening a door, are actually hugely complex, requiring a robot to understand how different door mechanisms work, how much force to apply to a handle and how to maintain balance while doing so. The real world is extremely varied and constantly changing.

The approach now gathering steam is to control a robot using the same type of AI foundation models that power image generators and chatbots such as ChatGPT. These models use brain-inspired neural networks to learn from huge swathes of generic data. They build associations between elements of their training data and, when asked for an output, tap these connections to generate appropriate words or images, often with uncannily good results.

Likewise, a robot foundation model is trained on text and images from the Internet, providing it with information about the nature of various objects and their contexts. It also learns from examples of robotic operations. It can be trained, for example, on videos of robot trial and error, or videos of robots that are being remotely operated by humans, alongside the instructions that pair with those actions. A trained robot foundation model can then observe a scenario and use its learnt associations to predict what action will lead to the best outcome.
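To make the training recipe concrete, here is a deliberately tiny, hypothetical sketch in pure Python (no real robotics stack): a “policy” built from demonstrations that each pair a scene observation and a language instruction with the action a teleoperator took, which then picks an action for a new scenario by tapping the closest learnt association. Real robot foundation models use large neural networks rather than this toy matching scheme; every name and data point below is invented for illustration.

```python
# Hypothetical illustration (not any real system's API): predicting a robot
# action from (observation, instruction) pairs seen in demonstrations.
from dataclasses import dataclass

@dataclass(frozen=True)
class Demo:
    observation: frozenset   # objects visible in the scene
    instruction: str         # what the human operator was asked to do
    action: str              # the action the teleoperated robot took

DEMONSTRATIONS = [
    Demo(frozenset({"can", "table"}), "move the can onto the table", "place(can, table)"),
    Demo(frozenset({"apple", "bin"}), "pick up apples from the bin", "pick(apple, bin)"),
    Demo(frozenset({"block", "tray"}), "put the block in the tray", "place(block, tray)"),
]

def predict_action(observation, instruction):
    """Score each demonstration by overlap with the current scene and
    instruction, and return the action of the best-matching one."""
    def score(demo):
        scene_overlap = len(demo.observation & observation)
        word_overlap = len(set(demo.instruction.split()) & set(instruction.split()))
        return scene_overlap + word_overlap
    return max(DEMONSTRATIONS, key=score).action
```

The point of the sketch is the shape of the data, not the matching rule: training examples bind what the robot saw and was told to what it then did.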

Google DeepMind has built one of the most advanced robotic foundation models, known as Robotic Transformer 2 (RT-2), which can operate a mobile robot arm built by its sister company Everyday Robots in Mountain View, California. Like other robotic foundation models, it was trained on both the Internet and videos of robotic operation. Thanks to the online training, RT-2 can follow instructions even when those commands go beyond what the robot has seen another robot do before [1]. For example, it can move a drink can onto a picture of Taylor Swift when asked to do so — even though Swift’s image was not in any of the 130,000 demonstrations that RT-2 had been trained on.

In other words, knowledge gleaned from Internet trawling (such as what the singer Taylor Swift looks like) is being carried over into the robot’s actions. “A lot of Internet concepts just transfer,” says Keerthana Gopalakrishnan, an AI and robotics researcher at Google DeepMind in San Francisco, California. This radically reduces the amount of physical data that a robot needs to have absorbed to cope in different situations, she says.

But to fully understand the basics of movements and their consequences, robots still need to learn from lots of physical data. And therein lies a problem.

Data dearth

Although chatbots are being trained on billions of words from the Internet, there is no equivalently large data set for robotic activity. This lack of data has left robotics “in the dust”, says Khazatsky.

Pooling data is one way around this. Khazatsky and his colleagues have created DROID [2], an open-source data set that brings together around 350 hours of video data from one type of robot arm (the Franka Panda 7DoF robot arm, built by Franka Robotics in Munich, Germany), as it was being remotely operated by people in 18 laboratories around the world. The robot-eye-view camera has recorded visual data in hundreds of environments, including bathrooms, laundry rooms, bedrooms and kitchens. This diversity helps robots to perform well on tasks with previously unencountered elements, says Khazatsky.
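A pooled data set like this is, at heart, a collection of teleoperation episodes tagged with where and in what setting they were recorded. The sketch below shows the idea with invented lab names, field names, and numbers (it is not DROID’s actual schema): episodes from different labs are merged, and summary statistics such as total hours and scene diversity fall out of the pooled collection.

```python
# Hypothetical sketch of pooling teleoperation episodes from many labs.
# Lab names, field names, and numbers are invented for illustration.
EPISODES = [
    {"lab": "lab_a", "scene": "kitchen", "minutes": 12.5},
    {"lab": "lab_b", "scene": "bathroom", "minutes": 4.0},
    {"lab": "lab_a", "scene": "bedroom", "minutes": 7.5},
]

def pool_stats(episodes):
    """Summarize a pooled data set: total hours and diversity counts."""
    return {
        "hours": sum(e["minutes"] for e in episodes) / 60,
        "distinct_scenes": len({e["scene"] for e in episodes}),
        "labs": len({e["lab"] for e in episodes}),
    }
```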


When prompted to ‘pick up extinct animal’, Google’s RT-2 model selects the dinosaur figurine from a crowded table. Credit: Google DeepMind

Gopalakrishnan is part of a collaboration of more than a dozen academic labs that is also bringing together robotic data, in its case from a diversity of robot forms, from single arms to quadrupeds. The collaborators’ theory is that learning about the physical world in one robot body should help an AI to operate another — in the same way that learning in English can help a language model to generate Chinese, because the underlying concepts about the world that the words describe are the same. This seems to work. The collaboration’s resulting foundation model, called RT-X, which was released in October 2023 [3], performed better on real-world tasks than did models the researchers trained on one robot architecture.
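One way to picture cross-embodiment training is as a normalization step: actions logged by very different robots are mapped into one shared action space before training, so a single model sees them as comparable. The sketch below is a hypothetical illustration of that idea — the embodiment names, action formats, and 0.1 s timestep are all invented, not the scheme any published model uses.

```python
DT = 0.1  # hypothetical control timestep, in seconds

def to_shared_action(embodiment, raw):
    """Map a per-embodiment action format to a shared
    (dx, dy, dz, gripper) representation."""
    if embodiment == "single_arm":
        # already logged as end-effector deltas plus a gripper command
        return tuple(raw)
    if embodiment == "quadruped":
        # logged as planar body velocity; integrate over one timestep
        vx, vy = raw
        return (vx * DT, vy * DT, 0.0, 0.0)
    raise ValueError(f"unknown embodiment: {embodiment}")
```

With every episode rewritten into the shared space, data from arms and quadrupeds can feed a single training run.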

Many researchers say that having this kind of diversity is essential. “We believe that a true robotics foundation model should not be tied to only one embodiment,” says Peter Chen, an AI researcher and co-founder of Covariant, an AI firm in Emeryville, California.

Covariant is also working hard on scaling up robot data. The company, which was set up in part by former OpenAI researchers, began collecting data in 2018 from 30 variations of robot arms in warehouses across the world, which all run using Covariant software. Covariant’s Robotics Foundation Model 1 (RFM-1) goes beyond collecting video data to encompass sensor readings, such as how much weight was lifted or force applied. This kind of data should help a robot to perform tasks such as manipulating a squishy object, says Gopalakrishnan — in theory, helping a robot to know, for example, how not to bruise a banana.

Covariant has built up a proprietary database that includes hundreds of billions of ‘tokens’ — units of real-world robotic information — which Chen says is roughly on a par with the scale of data that trained GPT-3, the 2020 version of OpenAI's large language model. “We have way more real-world data than other people, because that’s what we have been focused on,” Chen says. RFM-1 is poised to roll out soon, says Chen, and should allow operators of robots running Covariant’s software to type or speak general instructions, such as “pick up apples from the bin”.
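Covariant has not published what its tokens actually are, but the general idea of discretizing continuous sensor readings into integer token IDs can be sketched in a few lines; the bin count, sensor names, and ranges below are purely illustrative assumptions.

```python
def quantize(value, lo, hi, bins=256):
    """Map a continuous reading in [lo, hi] to an integer token in [0, bins-1],
    clamping out-of-range values to the ends of the scale."""
    clamped = min(max(value, lo), hi)
    return int((clamped - lo) / (hi - lo) * (bins - 1))

def tokenize_reading(force_n, grip_width_m):
    """Turn one (force, gripper-width) sensor sample into two tokens.
    The 0-50 N and 0-0.1 m ranges are invented for this example."""
    return (quantize(force_n, 0.0, 50.0), quantize(grip_width_m, 0.0, 0.1))
```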

Another way to access large databases of movement is to focus on a humanoid robot form so that an AI can learn by watching videos of people — of which there are billions online. Nvidia’s Project GR00T foundation model, for example, is ingesting videos of people performing tasks, says Andrews. Although copying humans has huge potential for boosting robot skills, doing so well is hard, says Gopalakrishnan. For example, robot videos generally come with data about context and commands — the same isn’t true for human videos, she says.

Virtual reality

A final and promising way to find limitless supplies of physical data, researchers say, is through simulation. Many roboticists are working on building 3D virtual-reality environments, the physics of which mimic the real world, and then wiring those up to a robotic brain for training. Simulators can churn out huge quantities of data and allow humans and robots to interact virtually, without risk, in rare or dangerous situations, all without wearing out the mechanics. “If you had to get a farm of robotic hands and exercise them until they achieve [a high] level of dexterity, you will blow the motors,” says Nvidia’s Andrews.

But making a good simulator is a difficult task. “Simulators have good physics, but not perfect physics, and making diverse simulated environments is almost as hard as just collecting diverse data,” says Khazatsky.

Meta and Nvidia are both betting big on simulation to scale up robot data, and have built sophisticated simulated worlds: Habitat from Meta and Isaac Sim from Nvidia. In them, robots gain the equivalent of years of experience in a few hours, and, in trials, they then successfully apply what they have learnt to situations they have never encountered in the real world. “Simulation is an extremely powerful but underrated tool in robotics, and I am excited to see it gaining momentum,” says Rai.
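The “diverse simulated environments” point can be illustrated with domain randomization, a standard sim-to-real technique: each simulated episode perturbs the physics parameters so a policy cannot overfit to one (necessarily imperfect) physics model. The parameter names and ranges below are invented for illustration, not taken from Habitat or Isaac Sim.

```python
# Minimal sketch of domain randomization for simulated robot training.
import random

def randomized_physics(rng):
    """Sample one episode's physics parameters (illustrative ranges)."""
    return {
        "friction": rng.uniform(0.5, 1.5),
        "object_mass_kg": rng.uniform(0.1, 0.5),
        "motor_latency_ms": rng.uniform(0.0, 30.0),
    }

def generate_episode_configs(n, seed=0):
    """Reproducibly generate n randomized episode configurations."""
    rng = random.Random(seed)
    return [randomized_physics(rng) for _ in range(n)]
```

Seeding the generator keeps experiments reproducible while still exposing the policy to a spread of plausible physics.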

Many researchers are optimistic that foundation models will help to create general-purpose robots that can replace human labour. In February, Figure, a robotics company in Sunnyvale, California, raised US$675 million in investment for its plan to use language and vision models developed by OpenAI in its general-purpose humanoid robot. A demonstration video shows a robot giving a person an apple in response to a general request for ‘something to eat’. The video on X (the platform formerly known as Twitter) has racked up 4.8 million views.

Exactly how this robot’s foundation model has been trained, along with any details about its performance across various settings, is unclear (neither OpenAI nor Figure responded to Nature’s requests for an interview). Such demos should be taken with a pinch of salt, says Soh. The environment in the video is conspicuously sparse, he says. Adding a more complex environment could potentially confuse the robot — in the same way that such environments have fooled self-driving cars. “Roboticists are very sceptical of robot videos for good reason, because we make them and we know that out of 100 shots, there’s usually only one that works,” Soh says.

Hurdles ahead

As the AI research community forges ahead with robotic brains, many of those who actually build robots caution that the hardware also presents a challenge: robots are complicated and break a lot. Hardware has been advancing, Chen says, but “a lot of people looking at the promise of foundation models just don't know the other side of how difficult it is to deploy these types of robots”, he says.

Another issue is how far robot foundation models can get using the visual data that make up the vast majority of their physical training. Robots might need reams of other kinds of sensory data, for example from the sense of touch or proprioception — a sense of where their body is in space — says Soh. Those data sets don’t yet exist. “There’s all this stuff that’s missing, which I think is required for things like a humanoid to work efficiently in the world,” he says.

Releasing foundation models into the real world comes with another major challenge — safety. In the two years since they started proliferating, large language models have been shown to come up with false and biased information. They can also be tricked into doing things that they are programmed not to do, such as telling users how to make a bomb. Giving AI systems a body brings these types of mistake and threat to the physical world. “If a robot is wrong, it can actually physically harm you or break things or cause damage,” says Gopalakrishnan.

Valuable work going on in AI safety will transfer to the world of robotics, says Gopalakrishnan. In addition, her team has imbued some robot AI models with rules that layer on top of their learning, such as not to even attempt tasks that involve interacting with people, animals or other living organisms. “Until we have confidence in robots, we will need a lot of human supervision,” she says.
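Hand-written rules layered on top of a learned policy amount to a veto filter: whatever action the model proposes, a fixed rule set can override it before it reaches the motors. Below is a minimal sketch of that pattern, with invented action and rule names — not the actual mechanism used by Gopalakrishnan’s team.

```python
# Targets the robot must never attempt to interact with, per a fixed rule set.
FORBIDDEN_TARGETS = {"person", "animal", "plant"}

def safety_filter(proposed_action):
    """Pass through an allowed (verb, target) action; veto forbidden ones."""
    verb, target = proposed_action
    if target in FORBIDDEN_TARGETS:
        return ("stop", None)  # safe no-op overrides the learned policy
    return proposed_action
```

Because the rules sit outside the learned model, they hold even when the model itself misbehaves.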

Despite the risks, there is a lot of momentum in using AI to improve robots — and using robots to improve AI. Gopalakrishnan thinks that hooking up AI brains to physical robots will improve the foundation models, for example giving them better spatial reasoning. Meta, says Rai, is among those pursuing the hypothesis that “true intelligence can only emerge when an agent can interact with its world”. That real-world interaction, some say, is what could take AI beyond learning patterns and making predictions, to truly understanding and reasoning about the world.

What the future holds depends on who you ask. Brooks says that robots will continue to improve and find new applications, but their eventual use “is nowhere near as sexy” as humanoids replacing human labour. But others think that developing a functional and safe humanoid robot that is capable of cooking dinner, running errands and folding the laundry is possible — but could just cost hundreds of millions of dollars. “I’m sure someone will do it,” says Khazatsky. “It’ll just be a lot of money, and time.”

Nature 630, 22–24 (2024)

doi: https://doi.org/10.1038/d41586-024-01442-5

Updates & Corrections

Correction 31 May 2024: An earlier version of this feature gave the wrong name for Nvidia’s simulated world.

References

1. Brohan, A. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2307.15818 (2023).
2. Khazatsky, A. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2403.12945 (2024).
3. Open X-Embodiment Collaboration et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2310.08864 (2023).



Physical Robots in Education: A Systematic Review Based on the Technological Pedagogical Content Knowledge Framework


1. Introduction

  • RQ1: What learning domain has been adopted for the application of robots in educational teaching?
  • RQ2: What teaching strategy has been used in the application of robots in educational teaching?
  • RQ3: What robot types have been used in the application of robots in educational teaching?
  • RQ4: What learning results have been identified in the application of robots in educational teaching?
  • RQ5: What problems with using robots have been identified in the application of robots in educational teaching?
  • RQ6: What robotic support has been identified in the application of robots in educational teaching?
  • RQ7: What robotic personality has been used in the application of robots in educational teaching?

2. Research Methods

2.1. Literature Search
2.2. Data Selection
2.3. Coding Schemes
3.1. Content Knowledge—Learning Domain
3.2. Pedagogical Knowledge—Teaching Strategy
3.3. Technological Knowledge—Robot Types
3.4. Technological Content Knowledge—Learning Results
3.5. Technological Pedagogical Knowledge—Problems with Using Robots
3.6. Pedagogical Content Knowledge—Robotic Support
3.7. Technological Pedagogical Content Knowledge—Robotic Personality
4. Discussion
4.1. Content Knowledge—Learning Domain
4.2. Pedagogical Knowledge—Teaching Strategy
4.3. Technological Knowledge—Robot Types
4.4. Technological Content Knowledge—Learning Results
4.5. Technological Pedagogical Knowledge—Problems with Using Robots
4.6. Pedagogical Content Knowledge—Robotic Support
4.7. Technological Pedagogical Content Knowledge—Robotic Personality
5. Conclusions
5.1. Contributions to the Literature
5.2. Practical Contributions
5.3. Limitations and Further Research Endeavors
5.4. Implications of the Findings
Author Contributions
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest

| Article | Learning Domain | Learning Results | Robot Types | Robotic Support | Robotic Personality | Teaching Strategy |
|---|---|---|---|---|---|---|
| Engwall and Lopes [ ] | Languages | Behavioral | Robotic heads | Information support | Extraversion | Communication |
| Fridin [ ] | Languages | Behavioral | Humanoid robots | Information support and emotional support | Openness | Communication |
| Hughes-Roberts, Brown [ ] | Health, Medical or Nursing | Behavioral | Humanoid robots | Information support | Conscientiousness | Communication |
| Wu, Wang [ ] | Languages | Affective, behavioral, cognitive | Face or belly screen robots | Information support and emotional support | Extraversion | Physical interaction |
| Özdemir and Karaman [ ] | Health, Medical or Nursing | Behavioral | Humanoid robots | Information support and emotional support | Extraversion | Physical interaction |
| Banaeian and Gilanlioglu [ ] | Languages | Affective, behavioral | Humanoid robots | Information support and emotional support | Extraversion | CLL |
| Yang, Luo [ ] | Engineering or computers | Affective, behavioral, cognitive | Programmable robots | Information support | Agreeableness | Physical interaction |
| Noh and Lee [ ] | Science | Cognitive | Programmable robots | Information support | Agreeableness | Physical interaction |
| Hong, Huang [ ] | Languages | Affective, cognitive | Programmable robots | Information support | Agreeableness | CLL |
| Lei, Clemente [ ] | Social science or social studies | Behavioral | Face or belly screen robots | Information support | Neuroticism | Communication |
| Engwall, Lopes [ ] | Languages | Affective | Robotic heads | Information support and emotional support | Extraversion | Communication |
| Al Hakim, Yang [ ] | Languages | Affective, behavioral | Face or belly screen robots | Information support and emotional support | Extraversion | Role play |
| Velentza, Fachantidis [ ] | Science | Behavioral | Humanoid robots | Information support and emotional support | Agreeableness | Communication |
| Kewalramani, Kidman [ ] | Science | Affective, behavioral, cognitive | Toy-like robots | Information support and emotional support | Agreeableness | Communication |
| Chen, Park [ ] | Languages | Affective, behavioral | Toy-like robots | Information support and emotional support | Openness | Role play |
| Hung, Chao [ ] | Languages | Affective | Face or belly screen robots | Information support and emotional support | Extraversion | Practice on specific learning material |
| Crompton, Gregory [ ] | Social science or social studies | Affective, behavioral, cognitive | Humanoid robots | Information support and emotional support | Conscientiousness | Physical interaction |
| Chen Hsieh [ ] | Languages | Affective, behavioral | Face or belly screen robots | Information support and emotional support | Neuroticism | Practice on specific learning material |
| Wei, Hung [ ] | Mathematics | Affective, cognitive | Toy-like robots | Information support and emotional support | Conscientiousness | Communication |
| Alemi and Haeri [ ] | Languages | Behavioral | Humanoid robots | Information support and emotional support | Openness | Physical interaction |
| Chang, Lee [ ] | Languages | Behavioral | Humanoid robots | Information support and emotional support | Extraversion | Physical interaction |
| Mioduser, Levy [ ] | Languages | Cognitive | Programmable robots | Information support | Agreeableness | Physical interaction |
| Iio, Maeda [ ] | Languages | Cognitive | Face or belly screen robots | Information support and emotional support | Extraversion | Role play |
| Leeuwestein, Barking [ ] | Languages | Behavioral | Humanoid robots | Information support and emotional support | Extraversion | CLL |
| Sen, Ay [ ] | Engineering or computers | Cognitive | Toy-like robots | Information support | Agreeableness | Practice on specific learning material |
| Keane, Chalmers [ ] | Languages | Affective, behavioral | Humanoid robots | Information support and emotional support | Agreeableness | Communication |
| David, Costescu [ ] | Health, Medical or Nursing | Cognitive | Humanoid robots | Information support | Extraversion | Physical interaction |
| Kewalramani, Palaiologou [ ] | Social science or social studies | Affective, behavioral | Toy-like robots | Information support and emotional support | Extraversion | Role play |
| Mitnik, Nussbaum [ ] | Science | Cognitive | Face or belly screen robots | Information support | Agreeableness | CLL |
| Resing, Bakker [ ] | Social science or social studies | Cognitive | Toy-like robots | Information support and emotional support | Extraversion | Communication |
| Kim, Marx [ ] | Non-specified | Behavioral | Toy-like robots | Information support and emotional support | Extraversion | Communication |
| Chen Hsieh and Lee [ ] | Languages | Affective, behavioral, cognitive | Face or belly screen robots | Information support and emotional support | Neuroticism | Practice on specific learning material |
| Van den Berghe, de Haas [ ] | Languages | Affective | Humanoid robots | Information support and emotional support | Conscientiousness | Practice on specific learning material |
| Nam, Kim [ ] | Science | Cognitive | Programmable robots | Information support and emotional support | Agreeableness | Physical interaction |
| Merkouris, Chorianopoulou [ ] | Science | Cognitive | Programmable robots | Information support | Agreeableness | Physical interaction |
| Han, Jo [ ] | Arts or design | Affective, behavioral | Face or belly screen robots | Information support and emotional support | Extraversion | Practice on specific learning material |
| Konijn and Hoorn [ ] | Science | Cognitive | Humanoid robots | Information support and emotional support | Extraversion | Communication |
| Valente, Caceffo [ ] | Social science or social studies | Behavioral | Toy-like robots | Information support | Agreeableness | Role play |
| Yueh, Lin [ ] | Languages | Behavioral | Face or belly screen robots | Information support and emotional support | Agreeableness | Practice on specific learning material |
| Evripidou, Amanatiadis [ ] | Science | Affective, behavioral, cognitive | Programmable robots | Information support and emotional support | Agreeableness | Practice on specific learning material |
| Mazzoni and Benvenuti [ ] | Languages | Behavioral, cognitive | Humanoid robots | Information support and emotional support | Openness | Physical interaction |
| Lee, Noh [ ] | Languages | Affective, cognitive | Face or belly screen robots | Information support and emotional support | Openness | Role play |
| Kim and Tscholl [ ] | Languages | Affective, behavioral, cognitive | Face or belly screen robots | Information support and emotional support | Extraversion | CLL |
| Shumway, Welch [ ] | Mathematics | Behavioral, cognitive | Programmable robots | Information support | Agreeableness | Physical interaction |
| Liao and Lu [ ] | Languages | Behavioral | Face or belly screen robots | Information support | Neuroticism | Communication |
| Hsiao, Chang [ ] | Languages | Affective, behavioral, cognitive | Face or belly screen robots | Information support and emotional support | Extraversion | CLL |
| Çakır, Korkmaz [ ] | Engineering or computers | Cognitive | Programmable robots | Information support and emotional support | Agreeableness | Physical interaction |
| Chernyak and Gary [ ] | Social science or social studies | Affective, behavioral, cognitive | Toy-like robots | Information support and emotional support | Agreeableness | Physical interaction |
| Yang, Ng [ ] | Science | Cognitive | Programmable robots | Information support | Agreeableness | Physical interaction |
| Resing, Vogelaar [ ] | Science | Cognitive | Toy-like robots | Information support and emotional support | Extraversion | Physical interaction |
| Brainin, Shamir [ ] | Engineering or computers | Cognitive | Programmable robots | Information support | Agreeableness | Practice on specific learning material |
| Neumann, Neumann [ ] | Arts or design | Behavioral | Humanoid robots | Information support and emotional support | Extraversion | Physical interaction |
| Benvenuti and Mazzoni [ ] | Social science or social studies | Cognitive | Humanoid robots | Information support and emotional support | Conscientiousness | Communication |
| Pop, Simut [ ] | Social science or social studies | Cognitive | Face or belly screen robots | Information support and emotional support | Extraversion | Communication |
| Chevalier, Giang [ ] | Engineering or computers | Cognitive | Programmable robots | Information support | Agreeableness | Communication |
Chew and Chua [ ]LanguagesBehavioralHumanoid robotsInformation support and emotional supportExtraversionCLL
Pérez-Marín, Hijón-Neira [ ]Social science or social studiesBehavioral, cognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
Bravo, Hurtado [ ]Arts or designAffective, cognitiveProgrammable robotsInformation supportAgreeablenessRole play
Demir-Lira, Kanero [ ]LanguagesBehavioralHumanoid robotsInformation support and emotional supportConscientiousnessCLL
Alemi and Bahramipour [ ]LanguagesBehavioral, cognitiveHumanoid robotsInformation support and emotional supportExtraversionCLL
Cherniak, Lee [ ]Engineering or computersBehavioralProgrammable robotsInformation supportAgreeablenessPractice on specific learning material
Silva, Fonseca [ ]Engineering or computersCognitiveProgrammable robotsInformation supportAgreeablenessPractice on specific learning material
Arar, Belazoui [ ]LanguagesCognitiveRobotic headsInformation support and emotional supportExtraversionCLL
Khalifa, Kato [ ]LanguagesBehavioral, cognitiveHumanoid robotsInformation support and emotional supportConscientiousnessCommunication
Hall and McCormick [ ]ScienceCognitiveProgrammable robotsInformation support and emotional supportAgreeablenessPhysical interaction
Tolksdorf, Crawshaw [ ]LanguagesBehavioralHumanoid robotsInformation support and emotional supportConscientiousnessCLL
Ferrarelli and Iocchi [ ]ScienceBehavioral, cognitiveFace or belly screen robotsInformation supportAgreeablenessPhysical interaction
Ishino, Goto [ ]Social science or social studiesCognitiveHumanoid robotsInformation supportAgreeablenessCommunication
Alhashmi, Mubin [ ]Non-specifiedAffectiveHumanoid robotsInformation support and emotional supportConscientiousnessPhysical interaction
Welch, Shumway [ ]Engineering or computersCognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
Keller and John [ ]Engineering or computersAffectiveHumanoid robotsInformation support and emotional supportExtraversionRole play
Paucar-Curasma, Villalba-Condori [ ]Engineering or computersAffective, cognitiveProgrammable robotsInformation support and emotional supportAgreeablenessPhysical interaction
Urlings, Coppens [ ]Social science or social studiesCognitiveProgrammable robotsInformation support and emotional supportAgreeablenessPractice on specific learning material
So, Wong [ ]Health, Medical or NursingBehavioral, cognitiveHumanoid robotsInformation supportExtraversionRole play
Liang and Hwang [ ]LanguagesBehavioralFace or belly screen robotsInformation supportAgreeablenessCommunication
Peura, Mutta [ ]LanguagesBehavioralHumanoid robotsInformation support and emotional supportConscientiousnessPractice on specific learning material
Veivo and Mutta [ ]LanguagesBehavioralHumanoid robotsInformation supportConscientiousnessPractice on specific learning material
Chung [ ]Arts or designBehavioral, cognitiveHumanoid robotsInformation support and emotional supportExtraversionPhysical interaction
Chang, Hwang [ ]Health, Medical or NursingAffective, behavioralFace or belly screen robotsInformation support and emotional supportExtraversionCommunication
Kalmpourtzis and Romero [ ]Social science or social studiesBehavioral, cognitiveToy-like robotsInformation supportAgreeablenessPhysical interaction
Saadatzi, Pennington [ ]LanguagesCognitiveHumanoid robotsInformation supportAgreeablenessPractice on specific learning material
Chiang, Cheng [ ]LanguagesCognitiveFace or belly screen robotsInformation supportConscientiousnessCLL
Cheng, Wang [ ]LanguagesBehavioral, cognitiveFace or belly screen robotsInformation support and emotional supportExtraversionRole play
Sabena [ ]ScienceBehavioral, cognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
Kwon, Jeon [ ]Engineering or computersCognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
Chen, Qiu [ ]Non-specifiedAffectiveHumanoid robotsInformation support and emotional supportExtraversionCommunication
Hsieh, Yeh [ ]LanguagesAffective, behavioral, cognitiveFace or belly screen robotsInformation support and emotional supportExtraversionCLL
Angeli and Georgiou [ ]Engineering or computersBehavioral, cognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
Kim, Hwang [ ]Social science or social studiesAffective, behavioralToy-like robotsInformation support and emotional supportExtraversionCommunication
Bargagna, Castro [ ]Health, Medical or NursingBehavioral, cognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
Cervera, Diago [ ]Engineering or computersCognitiveProgrammable robotsInformation supportAgreeablenessPhysical interaction
So, Cheng [ ]Health, Medical or NursingAffective, behavioral, cognitiveHumanoid robotsInformation support and emotional supportExtraversionRole play
  • Alam, A. Possibilities and apprehensions in the landscape of artificial intelligence in education. In Proceedings of the 2021 International Conference on Computational Intelligence and Computing Applications (ICCICA), Nagpur, India, 26–27 November 2021; pp. 1–8.
  • Shimada, M.; Kanda, T.; Koizumi, S. How can a social robot facilitate children’s collaboration? In Proceedings of the Social Robotics: 4th International Conference, ICSR 2012, Chengdu, China, 29–31 October 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 98–107.
  • Blanchard, S.; Freiman, V.; Lirrete-Pitre, N. Strategies used by elementary schoolchildren solving robotics-based complex tasks: Innovative potential of technology. Procedia Soc. Behav. Sci. 2010, 2, 2851–2857.
  • Lee, I.; Martin, F.; Denner, J.; Coulter, B.; Allan, W.; Erickson, J.; Malyn-Smith, J.; Werner, L.J. Computational thinking for youth in practice. ACM Inroads 2011, 2, 32–37.
  • Alam, A. Social robots in education for long-term human-robot interaction: Socially supportive behaviour of robotic tutor for creating robo-tangible learning environment in a guided discovery learning interaction. ECS Trans. 2022, 107, 12389.
  • Hussain, T.; Eskildsen, J.; Edgeman, R.; Ismail, M.; Shoukry, A.M.; Gani, S. Imperatives of sustainable university excellence: A conceptual framework. Sustainability 2019, 11, 5242.
  • Mahnkopf, B. The ‘4th Wave of Industrial Revolution’—A Promise Blind to Social Consequences, Power and Ecological Impact in the Era of ‘Digital Capitalism’; EuroMemo Group: Vienna, Austria, 2019.
  • Schina, D.; Esteve-González, V.; Usart, M.; Lázaro-Cantabrana, J.-L.; Gisbert, M. The integration of sustainable development goals in educational robotics: A teacher education experience. Sustainability 2020, 12, 10085.
  • Martín-Garin, A.; Millán-García, J.A.; Leon, I.; Oregi, X.; Estevez, J.; Marieta, C. Pedagogical approaches for sustainable development in building in higher education. Sustainability 2021, 13, 10203.
  • Bond, M.; Zawacki-Richter, O.; Nichols, M.J. Revisiting five decades of educational technology research: A content and authorship analysis of the British Journal of Educational Technology. Br. J. Educ. Technol. 2019, 50, 12–63.
  • Lai, C.L. Trends of mobile learning: A review of the top 100 highly cited papers. Br. J. Educ. Technol. 2020, 51, 721–742.
  • Cheng, Y.-W.; Sun, P.-C.; Chen, N.-S.J. The essential applications of educational robot: Requirement analysis from the perspectives of experts, researchers and instructors. Comput. Educ. 2018, 126, 399–416.
  • Xia, L.; Zhong, B.J. A systematic review on teaching and learning robotics content knowledge in K-12. Comput. Educ. 2018, 127, 267–282.
  • Woo, H.; LeTendre, G.K.; Pham-Shouse, T.; Xiong, Y. The use of social robots in classrooms: A review of field-based studies. Educ. Res. Rev. 2021, 33, 100388.
  • Chiu, M.-C.; Hwang, G.-J.; Tu, Y.-F. Roles, applications, and research designs of robots in science education: A systematic review and bibliometric analysis of journal publications from 1996 to 2020. Interact. Learn. Environ. 2022, 1–26.
  • Hwang, G.-J.; Chang, C.-Y. A review of opportunities and challenges of chatbots in education. Interact. Learn. Environ. 2021, 31, 4099–4112.
  • Scassellati, B.; Boccanfuso, L.; Huang, C.-M.; Mademtzi, M.; Qin, M.; Salomons, N.; Ventola, P.; Shic, F. Improving social skills in children with ASD using a long-term, in-home social robot. Sci. Robot. 2018, 3, eaat7544.
  • Serholt, S. Breakdowns in children’s interactions with a robotic tutor: A longitudinal study. Comput. Hum. Behav. 2018, 81, 250–264.
  • Westlund, J.M.K.; Dickens, L.; Jeong, S.; Harris, P.L.; DeSteno, D.; Breazeal, C.L. Children use non-verbal cues to learn new words from robots as well as people. Int. J. Child-Comput. Interact. 2017, 13, 1–9.
  • Rayner, R.; Kerwin, K.; Valentine, N. Robot-Assisted Teaching—The Future of Education? In EcoMechatronics: Challenges for Evolution, Development and Sustainability; Springer: Berlin/Heidelberg, Germany, 2022; pp. 329–357.
  • Sharkey, A.J. Should we welcome robot teachers? Ethics Inf. Technol. 2016, 18, 283–297.
  • Zhang, S.; Che, S.; Nan, D.; Kim, J.H. MOOCs as a Research Agenda: Changes Over Time. Int. Rev. Res. Open Distrib. Learn. 2022, 23, 193–210.
  • Sarkis-Onofre, R.; Catalá-López, F.; Aromataris, E.; Lockwood, C. How to properly use the PRISMA Statement. Syst. Rev. 2021, 10, 117.
  • Mishra, P.; Koehler, M.J. Introducing technological pedagogical content knowledge. In Proceedings of the Annual Meeting of the American Educational Research Association, New York, NY, USA, 24–28 March 2008; p. 16.
  • Engwall, O.; Lopes, J. Interaction and collaboration in robot-assisted language learning for adults. Comput. Assist. Lang. Learn. 2022, 35, 1273–1309.
  • Albarracin, D.; Hepler, J.; Tannenbaum, M. General action and inaction goals: Their behavioral, cognitive, and affective origins and influences. Curr. Dir. Psychol. Sci. 2011, 20, 119–123.
  • Huang, W.; Hew, K.F.; Fryer, L.K.J. Chatbots for language learning—Are they really useful? A systematic review of chatbot-supported language learning. J. Comput. Assist. Learn. 2022, 38, 237–257.
  • Leite, I.; Castellano, G.; Pereira, A.; Martinho, C.; Paiva, A. Empathic robots for long-term interaction: Evaluating social presence, engagement and perceived support in children. Int. J. Soc. Robot. 2014, 6, 329–341.
  • Diener, E.; Lucas, R.E. Personality traits. In General Psychology: Required Reading; NOBA: Salt Lake City, UT, USA, 2019; Volume 278.
  • Reich-Stiebert, N.; Eyssel, F. Learning with educational companion robots? Toward attitudes on education robots, predictors of attitudes, and application potentials for education robots. Int. J. Soc. Robot. 2015, 7, 875–888.
  • Barak, M.; Assal, M. Robotics and STEM learning: Students’ achievements in assignments according to the P3 Task Taxonomy—Practice, problem solving, and projects. Int. J. Technol. Des. Educ. 2018, 28, 121–144.
  • Alemi, M.; Haeri, N. Robot-assisted instruction of L2 pragmatics: Effects on young EFL learners’ speech act performance. Lang. Learn. Technol. 2020, 24, 86–103.
  • Chang, C.-W.; Lee, J.-H.; Chao, P.-Y.; Wang, C.-Y.; Chen, G.-D. Exploring the possibility of using humanoid robots as instructional tools for teaching a second language in primary school. J. Educ. Technol. Soc. 2010, 13, 13–24.
  • Crompton, H.; Gregory, K.; Burke, D. Humanoid robots supporting children’s learning in an early childhood setting. Br. J. Educ. Technol. 2018, 49, 911–927.
  • Kewalramani, S.; Palaiologou, I.; Dardanou, M.; Allen, K.-A.; Phillipson, S. Using robotic toys in early childhood education to support children’s social and emotional competencies. Australas. J. Early Child. 2021, 46, 355–369.
  • Hung, I.-C.; Chao, K.-J.; Lee, L.; Chen, N.-S. Designing a robot teaching assistant for enhancing and sustaining learning motivation. Interact. Learn. Environ. 2013, 21, 156–171.
  • Neumann, M.M.; Neumann, D.L.; Koch, L.-C. Young children’s interactions with a social robot during a drawing task. Eur. Early Child. Educ. Res. J. 2022, 31, 421–436.
  • Leeuwestein, H.; Barking, M.; Sodacı, H.; Oudgenoeg-Paz, O.; Verhagen, J.; Vogt, P.; Aarts, R.; Spit, S.; de Haas, M.; de Wit, J. Teaching Turkish-Dutch kindergartners Dutch vocabulary with a social robot: Does the robot’s use of Turkish translations benefit children’s Dutch vocabulary learning? J. Comput. Assist. Learn. 2021, 37, 603–620.
  • Yueh, H.P.; Lin, W.; Wang, S.C.; Fu, L.C. Reading with robot and human companions in library literacy activities: A comparison study. Br. J. Educ. Technol. 2020, 51, 1884–1900.
  • Ishino, T.; Goto, M.; Kashihara, A. Robot lecture for enhancing presentation in lecture. Res. Pract. Technol. Enhanc. Learn. 2022, 17, 1–22.
  • Chen, H.; Park, H.W.; Breazeal, C. Teaching and learning with children: Impact of reciprocal peer learning with a social robot on children’s learning and emotive engagement. Comput. Educ. 2020, 150, 103836.
  • Banaeian, H.; Gilanlioglu, I. Influence of the NAO robot as a teaching assistant on university students’ vocabulary learning and attitudes. Australas. J. Educ. Technol. 2021, 37, 71–87.
  • Fridin, M. Storytelling by a kindergarten social assistive robot: A tool for constructive learning in preschool education. Comput. Educ. 2014, 70, 53–64.
  • Alhashmi, M.; Mubin, O.; Baroud, R. Examining the use of robots as teacher assistants in UAE classrooms: Teacher and student perspectives. J. Inf. Technol. Educ. Res. 2021, 20, 245–261.
  • Wei, C.-W.; Hung, I.; Lee, L.; Chen, N.-S. A joyful classroom learning system with robot learning companion for children to learn mathematics multiplication. Turk. Online J. Educ. Technol.-TOJET 2011, 10, 11–23.
  • Lei, M.; Clemente, I.M.; Hu, Y. Student in the shell: The robotic body and student engagement. Comput. Educ. 2019, 130, 59–80.
  • Resing, W.C.; Bakker, M.; Elliott, J.G.; Vogelaar, B. Dynamic testing: Can a robot as tutor be of help in assessing children’s potential for learning? J. Comput. Assist. Learn. 2019, 35, 540–554.
  • van den Berghe, R.; de Haas, M.; Oudgenoeg-Paz, O.; Krahmer, E.; Verhagen, J.; Vogt, P.; Willemsen, B.; de Wit, J.; Leseman, P. A toy or a friend? Children’s anthropomorphic beliefs about robots and how these relate to second-language word learning. J. Comput. Assist. Learn. 2021, 37, 396–410.
  • Appel, M.; Izydorczyk, D.; Weber, S.; Mara, M.; Lischetzke, T. The uncanny of mind in a machine: Humanoid robots as tools, agents, and experiencers. Comput. Hum. Behav. 2020, 102, 274–286.
  • Natarajan, M.; Gombolay, M. Effects of anthropomorphism and accountability on trust in human robot interaction. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, New York, NY, USA, 23–26 March 2020; pp. 33–42.
  • Stroessner, S.J.; Benitez, J. The social perception of humanoid and non-humanoid robots: Effects of gendered and machinelike features. Int. J. Soc. Robot. 2019, 11, 305–315.
  • Wood, L.J.; Zaraki, A.; Robins, B.; Dautenhahn, K. Developing Kaspar: A humanoid robot for children with autism. Int. J. Soc. Robot. 2021, 13, 491–508.
  • Yoon, Y.; Ko, W.-R.; Jang, M.; Lee, J.; Kim, J.; Lee, G. Robots learn social skills: End-to-end learning of co-speech gesture generation for humanoid robots. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 4303–4309.
  • Mou, Y.; Shi, C.; Shen, T.; Xu, K. A systematic review of the personality of robot: Mapping its conceptualization, operationalization, contextualization and effects. Int. J. Hum.–Comput. Interact. 2020, 36, 591–605.
  • Luo, M.J.A. Exploring the Practice of Visual Communication Design in the New Era Based on the View of “Art + Technology”. Asian J. Soc. Sci. Stud. 2017, 2, 8.
  • Hwang, G.-J.; Xie, H.; Wah, B.W.; Gašević, D. Vision, challenges, roles and research issues of Artificial Intelligence in Education. Comput. Educ. Artif. Intell. 2020, 1, 100001.
  • Toh, L.P.E.; Causo, A.; Tzuo, P.-W.; Chen, I.-M.; Yeo, S.H. A review on the use of robots in education and young children. J. Educ. Technol. Soc. 2016, 19, 148–163.
  • Loukatos, D.; Kondoyanni, M.; Kyrtopoulos, I.-V.; Arvanitis, K.G. Enhanced robots as tools for assisting agricultural engineering students’ development. Electronics 2022, 11, 755.
  • Granados, D.F.P.; Yamamoto, B.A.; Kamide, H.; Kinugawa, J.; Kosuge, K. Dance teaching by a robot: Combining cognitive and physical human–robot interaction for supporting the skill learning process. IEEE Robot. Autom. Lett. 2017, 2, 1452–1459.
  • Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ. 2013, 6, 63–71.
  • Ardito, G.; Mosley, P.; Scollins, L. We, robot: Using robotics to promote collaborative and mathematics learning in a middle school classroom. Middle Grades Res. J. 2014, 9, 73.
  • Huang, M.-H.; Rust, R.T. Engaged to a robot? The role of AI in service. J. Serv. Res. 2021, 24, 30–41.
  • Robaczewski, A.; Bouchard, J.; Bouchard, K.; Gaboury, S. Socially assistive robots: The specific case of the NAO. Int. J. Soc. Robot. 2021, 13, 795–831.
  • Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954.
  • Han, J.; Jo, M.; Hyun, E.; So, H.-J. Examining young children’s perception toward augmented reality-infused dramatic play. Educ. Technol. Res. Dev. 2015, 63, 455–474.
  • Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.-J. A review of the applicability of robots in education. J. Technol. Educ. Learn. 2013, 1, 13.
  • Castro, E.; Cecchi, F.; Valente, M.; Buselli, E.; Salvini, P.; Dario, P. Can educational robotics introduce young children to robotics and how can we measure it? J. Comput. Assist. Learn. 2018, 34, 970–977.
  • Robert, L. Personality in the human robot interaction literature: A review and brief critique. In Proceedings of the 24th Americas Conference on Information Systems, New Orleans, LA, USA, 16–18 August 2018; pp. 16–18.
  • Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. Robot. Auton. Syst. 2003, 42, 143–166.
  • Sullivan, A.; Bers, M.U. Robotics in the early childhood classroom: Learning outcomes from an 8-week robotics curriculum in pre-kindergarten through second grade. Int. J. Technol. Des. Educ. 2016, 26, 3–20.
  • Post, L.S.; Guo, P.; Saab, N.; Admiraal, W.J. Effects of remote labs on cognitive, behavioral, and affective learning outcomes in higher education. Comput. Educ. 2019, 140, 103596.
  • Salas-Pilco, S.Z. The impact of AI and robotics on physical, social-emotional and intellectual learning outcomes: An integrated analytical framework. Br. J. Educ. Technol. 2020, 51, 1808–1825.
  • Jeon, J. Chatbot-assisted dynamic assessment (CA-DA) for L2 vocabulary learning and diagnosis. Comput. Assist. Lang. Learn. 2021, 36, 1338–1364.
  • Zhang, S.; Shan, C.; Lee, J.S.Y.; Che, S.; Kim, J.H. Effect of chatbot-assisted language learning: A meta-analysis. Educ. Inf. Technol. 2023, 28, 15223–15243.
  • Wang, X.; Li, L.; Tan, S.C.; Yang, L.; Lei, J. Preparing for AI-enhanced education: Conceptualizing and empirically examining teachers’ AI readiness. Comput. Hum. Behav. 2023, 146, 107798.
  • Conti, D.; Trubia, G.; Buono, S.; Di Nuovo, S.; Di Nuovo, A. An empirical study on integrating a small humanoid robot to support the therapy of children with Autism Spectrum Disorder and Intellectual Disability. Interact. Stud. 2021, 22, 177–211.
  • Haristiani, N. Artificial Intelligence (AI) chatbot as language learning medium: An inquiry. In Proceedings of the International Conference on Education, Science and Technology, Padang, Indonesia, 13–16 March 2019; p. 012020.
  • Reis, J. Customer Service Through AI-Powered Human-Robot Relationships: Where are we now? The case of Henn na Cafe, Japan. Technol. Soc. 2024, 77, 102570.
  • Yorita, A.; Egerton, S.; Oakman, J.; Chan, C.; Kubota, N. A robot assisted stress management framework: Using conversation to measure occupational stress. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 3761–3767.
  • Isernhagen, J.C. TeamMates: Providing emotional and academic support in rural schools. Rural Educ. 2010, 32, 29–35.
  • Lee, K.M.; Peng, W.; Jin, S.-A.; Yan, C. Can robots manifest personality?: An empirical test of personality recognition, social responses, and social presence in human–robot interaction. J. Commun. 2006, 56, 754–772.
  • Woods, S.; Dautenhahn, K.; Kaouri, C.; Boekhorst, R.; Koay, K.L. Is this robot like me? Links between human and robot personality traits. In Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots, Tsukuba, Japan, 5 December 2005; pp. 375–380.
  • Paetzel-Prüsmann, M.; Perugia, G.; Castellano, G. The influence of robot personality on the development of uncanny feelings. Comput. Hum. Behav. 2021, 120, 106756.
  • Ewald, H.; Klerings, I.; Wagner, G.; Heise, T.L.; Stratil, J.M.; Lhachimi, S.K.; Hemkens, L.G.; Gartlehner, G.; Armijo-Olivo, S.; Nussbaumer-Streit, B. Searching two or more databases decreased the risk of missing relevant studies: A metaresearch study. J. Clin. Epidemiol. 2022, 149, 154–164.
  • Hughes-Roberts, T.; Brown, D.; Standen, P.; Desideri, L.; Negrini, M.; Rouame, A.; Malavasi, M.; Wager, G.; Hasson, C. Examining engagement and achievement in learners with individual needs through robotic-based teaching sessions. Br. J. Educ. Technol. 2019, 50, 2736–2750.
  • Wu, W.-C.V.; Wang, R.-J.; Chen, N.-S. Instructional design using an in-house built teaching assistant robot to enhance elementary school English-as-a-foreign-language learning. Interact. Learn. Environ. 2015, 23, 696–714.
  • Özdemir, D.; Karaman, S. Investigating interactions between students with mild mental retardation and humanoid robot in terms of feedback types. Egit. Ve Bilim 2017, 42, 109–138.
  • Yang, W.; Luo, H.; Su, J. Towards inclusiveness and sustainability of robot programming in early childhood: Child engagement, learning outcomes and teacher perception. Br. J. Educ. Technol. 2022, 53, 1486–1510.
  • Noh, J.; Lee, J. Effects of robotics programming on the computational thinking and creativity of elementary school students. Educ. Technol. Res. Dev. 2020, 68, 463–484.
  • Hong, Z.-W.; Huang, Y.-M.; Hsu, M.; Shen, W.-W. Authoring robot-assisted instructional materials for improving learning performance and motivation in EFL classrooms. J. Educ. Technol. Soc. 2016, 19, 337–349.
  • Engwall, O.; Lopes, J.; Cumbal, R.; Berndtson, G.; Lindström, R.; Ekman, P.; Hartmanis, E.; Jin, E.; Johnston, E.; Tahir, G. Learner and teacher perspectives on robot-led L2 conversation practice. ReCALL 2022, 34, 344–359.
  • Al Hakim, V.G.; Yang, S.-H.; Liyanawatta, M.; Wang, J.-H.; Chen, G.-D. Robots in situated learning classrooms with immediate feedback mechanisms to improve students’ learning performance. Comput. Educ. 2022, 182, 104483.
  • Velentza, A.-M.; Fachantidis, N.; Lefkos, I. Learn with surprize from a robot professor. Comput. Educ. 2021, 173, 104272.
  • Kewalramani, S.; Kidman, G.; Palaiologou, I. Using artificial intelligence (AI)-interfaced robotic toys in early childhood settings: A case for children’s inquiry literacy. Eur. Early Child. Educ. Res. J. 2021, 29, 652–668.
  • Chen Hsieh, J. Digital storytelling outcomes and emotional experience among middle school EFL learners: Robot-assisted versus PowerPoint-assisted mode. TESOL Q. 2021, 55, 994–1010.
  • Mioduser, D.; Levy, S.T.; Talis, V. Episodes to scripts to rules: Concrete-abstractions in kindergarten children’s explanations of a robot’s behavior. Int. J. Technol. Des. Educ. 2009, 19, 15–36.
  • Iio, T.; Maeda, R.; Ogawa, K.; Yoshikawa, Y.; Ishiguro, H.; Suzuki, K.; Aoki, T.; Maesaki, M.; Hama, M. Improvement of Japanese adults’ English speaking skills via experiences speaking to a robot. J. Comput. Assist. Learn. 2019, 35, 228–245.
  • Sen, C.; Ay, Z.S.; Kiray, S.A. Computational thinking skills of gifted and talented students in integrated STEM activities based on the engineering design process: The case of robotics and 3D robot modeling. Think. Ski. Creat. 2021, 42, 100931.
  • Keane, T.; Chalmers, C.; Boden, M.; Williams, M. Humanoid robots: Learning a programming language to learn a traditional language. Technol. Pedagog. Educ. 2019, 28, 533–546.
  • David, D.O.; Costescu, C.A.; Matu, S.; Szentagotai, A.; Dobrean, A. Effects of a robot-enhanced intervention for children with ASD on teaching turn-taking skills. J. Educ. Comput. Res. 2020, 58, 29–62.
  • Mitnik, R.; Nussbaum, M.; Recabarren, M. Developing cognition with collaborative robotic activities. J. Educ. Technol. Soc. 2009, 12, 317–330.
  • Kim, Y.; Marx, S.; Pham, H.V.; Nguyen, T. Designing for robot-mediated interaction among culturally and linguistically diverse children. Educ. Technol. Res. Dev. 2021, 69, 3233–3254.
  • Chen Hsieh, J.; Lee, J.S. Digital storytelling outcomes, emotions, grit, and perceptions among EFL middle school learners: Robot-assisted versus PowerPoint-assisted presentations. Comput. Assist. Lang. Learn. 2023, 36, 1088–1115.
  • Nam, K.W.; Kim, H.J.; Lee, S. Connecting plans to action: The effects of a card-coded robotics curriculum and activities on Korean kindergartners. Asia-Pac. Educ. Res. 2019, 28, 387–397.
  • Merkouris, A.; Chorianopoulou, B.; Chorianopoulos, K.; Chrissikopoulos, V. Understanding the notion of friction through gestural interaction with a remotely controlled robot. J. Sci. Educ. Technol. 2019, 28, 209–221.
  • Konijn, E.A.; Hoorn, J.F. Robot tutor and pupils’ educational ability: Teaching the times tables. Comput. Educ. 2020, 157, 103970.
  • Valente, J.A.; Caceffo, R.; Bonacin, R.; dos Reis, J.C.; Gonçalves, D.A.; Baranauskas, M.C.C. Embodied-based environment for kindergarten children: Revisiting constructionist ideas. Br. J. Educ. Technol. 2021, 52, 986–1003.
  • Evripidou, S.; Amanatiadis, A.; Christodoulou, K.; Chatzichristofis, S.A. Introducing algorithmic thinking and sequencing using tangible robots. IEEE Trans. Learn. Technol. 2021, 14, 93–105.
  • Mazzoni, E.; Benvenuti, M. A robot-partner for preschool children learning English using socio-cognitive conflict. J. Educ. Technol. Soc. 2015, 18, 474–485.
  • Lee, S.; Noh, H.; Lee, J.; Lee, K.; Lee, G.G.; Sagong, S.; Kim, M. On the effectiveness of robot-assisted language learning. ReCALL 2011, 23, 25–58.
  • Kim, Y.; Tscholl, M. Young children’s embodied interactions with a social robot. Educ. Technol. Res. Dev. 2021, 69, 2059–2081.
  • Shumway, J.F.; Welch, L.E.; Kozlowski, J.S.; Clarke-Midura, J.; Lee, V.R. Kindergarten students’ mathematics knowledge at work: The mathematics for programming robot toys. Math. Think. Learn. 2023, 25, 380–408.
  • Liao, J.; Lu, X. Exploring the affordances of telepresence robots in foreign language learning. Lang. Learn. Technol. 2018, 22, 20–32.
  • Hsiao, H.-S.; Chang, C.-S.; Lin, C.-Y.; Hsu, H.-L. “iRobiQ”: The influence of bidirectional interaction on kindergarteners’ reading motivation, literacy, and behavior. Interact. Learn. Environ. 2015, 23, 269–292.
  • Çakır, R.; Korkmaz, Ö.; İdil, Ö.; Erdoğmuş, F.U. The effect of robotic coding education on preschoolers’ problem solving and creative thinking skills. Think. Ski. Creat. 2021, 40, 100812.
  • Chernyak, N.; Gary, H.E. Children’s cognitive and behavioral reactions to an autonomous versus controlled social robot dog. In Young Children’s Developing Understanding of the Biological World; Routledge: London, UK, 2019; pp. 73–90.
  • Yang, W.; Ng, D.T.K.; Gao, H. Robot programming versus block play in early childhood education: Effects on computational thinking, sequencing ability, and self-regulation. Br. J. Educ. Technol. 2022, 53, 1817–1841.
  • Resing, W.C.; Vogelaar, B.; Elliott, J.G. Children’s solving of ‘Tower of Hanoi’ tasks: Dynamic testing with the help of a robot. Educ. Psychol. 2020, 40, 1136–1163.
  • Brainin, E.; Shamir, A.; Eden, S. Robot programming intervention for promoting spatial relations, mental rotation and visual memory of kindergarten children. J. Res. Technol. Educ. 2022, 54, 345–358.
  • Benvenuti, M.; Mazzoni, E. Enhancing wayfinding in pre-school children through robot and socio-cognitive conflict. Br. J. Educ. Technol. 2020, 51, 436–458.
  • Pop, C.A.; Simut, R.E.; Pintea, S.; Saldien, J.; Rusu, A.S.; Vanderfaeillie, J.; David, D.O.; Lefeber, D.; Vanderborght, B. Social robots vs. computer display: Does the way social stories are delivered make a difference for their effectiveness on ASD children? J. Educ. Comput. Res. 2013 , 49 , 381–401. [ Google Scholar ] [ CrossRef ]
  • Chevalier, M.; Giang, C.; Piatti, A.; Mondada, F. Fostering computational thinking through educational robotics: A model for creative computational problem solving. Int. J. STEM Educ. 2020 , 7 , 39. [ Google Scholar ] [ CrossRef ]
  • Chew, E.; Chua, X.N. Robotic Chinese language tutor: Personalising progress assessment and feedback or taking over your job? Horiz. 2020 , 28 , 113–124. [ Google Scholar ] [ CrossRef ]
  • Pérez-Marín, D.; Hijón-Neira, R.; Pizarro, C. Coding in early years education: Which factors influence the skills of sequencing and plotting a route, and to what extent? Int. J. Early Years Educ. 2022 , 30 , 969–985. [ Google Scholar ] [ CrossRef ]
  • Bravo, F.A.; Hurtado, J.A.; González, E. Using robots with storytelling and drama activities in science education. Educ. Sci. 2021 , 11 , 329. [ Google Scholar ] [ CrossRef ]
  • Demir-Lira, Ö.E.; Kanero, J.; Oranç, C.; Koşkulu, S.; Franko, I.; Göksun, T.; Küntay, A.C. L2 Vocabulary Teaching by Social Robots: The Role of Gestures and On-Screen Cues as Scaffolds ; Frontiers in Education, Frontiers Media SA: Lausanne, Switzerland, 2020; p. 599636. [ Google Scholar ]
  • Alemi, M.; Bahramipour, S. An innovative approach of incorporating a humanoid robot into teaching EFL learners with intellectual disabilities. Asian-Pac. J. Second Foreign Lang. Educ. 2019 , 4 , 10. [ Google Scholar ] [ CrossRef ]
  • Cherniak, S.; Lee, K.; Cho, E.; Jung, S.E. Child-identified problems and their robotic solutions. J. Early Child. Res. 2019 , 17 , 347–360. [ Google Scholar ] [ CrossRef ]
  • Silva, R.; Fonseca, B.; Costa, C.; Martins, F. Fostering computational thinking skills: A didactic proposal for elementary school grades. Educ. Sci. 2021 , 11 , 518. [ Google Scholar ] [ CrossRef ]
  • Arar, C.; Belazoui, A.; Telli, A. Adoption of social robots as pedagogical aids for efficient learning of second language vocabulary to children. J. E-Learn. Knowl. Soc. 2021 , 17 , 119–126. [ Google Scholar ]
  • Khalifa, A.; Kato, T.; Yamamoto, S. Learning Effect of Implicit Learning in Joining-in-type Robot-assisted Language Learning System. Int. J. Emerg. Technol. Learn. 2019 , 14 , 105–123. [ Google Scholar ] [ CrossRef ]
  • Hall, J.A.; McCormick, K.I. “My Cars don’t Drive Themselves”: Preschoolers’ Guided Play Experiences with Button-Operated Robots. TechTrends 2022 , 66 , 510–526. [ Google Scholar ] [ CrossRef ]
  • Tolksdorf, N.F.; Crawshaw, C.E.; Rohlfing, K.J. Comparing the Effects of a Different Social Partner (Social Robot vs. Human) on Children’s Social Referencing in Interaction ; Frontiers in Education, Frontiers Media SA: Lausanne, Switzerland, 2021; p. 569615. [ Google Scholar ]
  • Ferrarelli, P.; Iocchi, L. Learning Newtonian physics through programming robot experiments. Technol. Knowl. Learn. 2021 , 26 , 789–824. [ Google Scholar ] [ CrossRef ]
  • Welch, L.E.; Shumway, J.F.; Clarke-Midura, J.; Lee, V.R. Exploring measurement through coding: Children’s conceptions of a dynamic linear unit with robot coding toys. Educ. Sci. 2022 , 12 , 143. [ Google Scholar ] [ CrossRef ]
  • Keller, L.; John, I. Motivating female students for computer science by means of robot workshops. Int. J. Eng. Pedagog. 2020 , 10 , 94. [ Google Scholar ] [ CrossRef ]
  • Paucar-Curasma, R.; Villalba-Condori, K.; Arias-Chavez, D.; Le, N.-T.; Garcia-Tejada, G.; Frango-Silveira, I. Evaluation of Computational Thinking using four educational robots with primary school students in Peru. Educ. Knowl. Soc. 2022 , 23 . [ Google Scholar ] [ CrossRef ]
  • Urlings, C.C.; Coppens, K.M.; Borghans, L. Measurement of executive functioning using a playful robot in kindergarten. Comput. Sch. 2019 , 36 , 255–273. [ Google Scholar ] [ CrossRef ]
  • So, W.-C.; Wong, M.K.-Y.; Lam, W.-Y.; Cheng, C.-H.; Ku, S.-Y.; Lam, K.-Y.; Huang, Y.; Wong, W.-L. Who is a better teacher for children with autism? Comparison of learning outcomes between robot-based and human-based interventions in gestural production and recognition. Res. Dev. Disabil. 2019 , 86 , 62–75. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Liang, J.-C.; Hwang, G.-J. A robot-based digital storytelling approach to enhancing EFL learners’ multimodal storytelling ability and narrative engagement. Comput. Educ. 2023 , 201 , 104827. [ Google Scholar ] [ CrossRef ]
  • Peura, L.; Mutta, M.; Johansson, M. Playing with pronunciation: A study on robot-assisted French pronunciation in a learning game. Nord. J. Digit. Lit. 2023 , 2 , 100–115. [ Google Scholar ] [ CrossRef ]
  • Veivo, O.; Mutta, M. Dialogue breakdowns in robot-assisted L2 learning. Comput. Assist. Lang. Learn. 2022 , 1–22. [ Google Scholar ] [ CrossRef ]
  • Chung, E.Y.-H. Robotic intervention program for enhancement of social engagement among children with autism spectrum disorder. J. Dev. Phys. Disabil. 2019 , 31 , 419–434. [ Google Scholar ] [ CrossRef ]
  • Chang, C.-Y.; Hwang, G.-J.; Chou, Y.-L.; Xu, Z.-Y.; Jen, H.-J. Effects of robot-assisted digital storytelling on hospitalized children’s communication during the COVID-19 pandemic. Educ. Technol. Res. Dev. 2023 , 71 , 793–805. [ Google Scholar ] [ CrossRef ]
  • Kalmpourtzis, G.; Romero, M. An affordance-based framework for the design and analysis of learning activities in playful educational robotics contexts. Interact. Learn. Environ. 2022 , 1–14. [ Google Scholar ] [ CrossRef ]
  • Saadatzi, M.N.; Pennington, R.C.; Welch, K.C.; Graham, J.H. Effects of a robot peer on the acquisition and observational learning of sight words in young adults with autism spectrum disorder. J. Spec. Educ. Technol. 2018 , 33 , 284–296. [ Google Scholar ] [ CrossRef ]
  • Chiang, Y.-H.V.; Cheng, Y.-W.; Chen, N.-S. Improving language learning activity design through identifying learning difficulties in a platform using educational robots and IoT-based tangible objects. Educ. Technol. Soc. 2023 , 26 , 84–100. [ Google Scholar ]
  • Cheng, Y.-W.; Wang, Y.; Cheng, Y.-J.; Chen, N.-S. The impact of learning support facilitated by a robot and IoT-based tangible objects on children’s game-based language learning. Comput. Assist. Lang. Learn. 2022 , 1–32. [ Google Scholar ] [ CrossRef ]
  • Sabena, C. Early child spatial development: A teaching experiment with programmable robots. In Mathematics and Technology ; Springer: Berlin/Heidelberg, Germany, 2017; pp. 13–30. [ Google Scholar ]
  • Kwon, K.; Jeon, M.; Zhou, C.; Kim, K.; Brush, T.A. Embodied learning for computational thinking in early primary education. J. Res. Technol. Educ. 2022 , 1–21. [ Google Scholar ] [ CrossRef ]
  • Chen, S.; Qiu, S.; Li, H.; Zhang, J.; Wu, X.; Zeng, W.; Huang, F. An integrated model for predicting pupils’ acceptance of artificially intelligent robots as teachers. Educ. Inf. Technol. 2023 , 28 , 11631–11654. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Hsieh, W.-M.; Yeh, H.-C.; Chen, N.-S. Impact of a robot and tangible object (R&T) integrated learning system on elementary EFL learners’ English pronunciation and willingness to communicate. Comput. Assist. Lang. Learn. 2023 , 1–26. [ Google Scholar ] [ CrossRef ]
  • Angeli, C.; Georgiou, K. Investigating the Effects of Gender and Scaffolding in Developing Preschool Children’s Computational Thinking during Problem-Solving with Bee-Bots ; Frontiers in Education, Frontiers Media SA: Lausanne, Switzerland, 2023; p. 757627. [ Google Scholar ]
  • Kim, Y.; Hwang, J.; Lim, S.; Cho, M.-H.; Lee, S. Child–robot interaction: Designing robot mediation to facilitate friendship behaviors. Interact. Learn. Environ. 2023 , 1–14. [ Google Scholar ] [ CrossRef ]
  • Bargagna, S.; Castro, E.; Cecchi, F.; Cioni, G.; Dario, P.; Dell’Omo, M.; Di Lieto, M.C.; Inguaggiato, E.; Martinelli, A.; Pecini, C. Educational robotics in down syndrome: A feasibility study. Technol. Knowl. Learn. 2019 , 24 , 315–323. [ Google Scholar ] [ CrossRef ]
  • Cervera, N.; Diago, P.D.; Orcos, L.; Yáñez, D.F. The acquisition of computational thinking through mentoring: An exploratory study. Educ. Sci. 2020 , 10 , 202. [ Google Scholar ] [ CrossRef ]
  • So, W.-C.; Cheng, C.-H.; Lam, W.-Y.; Wong, T.; Law, W.-W.; Huang, Y.; Ng, K.-C.; Tung, H.-C.; Wong, W. Robot-based play-drama intervention may improve the narrative abilities of Chinese-speaking preschoolers with autism spectrum disorder. Res. Dev. Disabil. 2019 , 95 , 103515. [ Google Scholar ] [ CrossRef ] [ PubMed ]
Inclusion criteria:
  • Research must use physical robots.
  • Research must report on the effectiveness of the robot in the actual teaching and learning process.
  • Research must be published in peer-reviewed journals.
  • Research must be reported as an empirical study to demonstrate the actual effectiveness of the robot in an educational setting.
  • Research must be reported in English.
  • Full text is available.

Exclusion criteria:
  • Research papers from conference proceedings, book chapters, magazines, news, and posters.
  • Incomplete studies, for example, studies that reported only on the development and design of robotic software or systems but not on empirical results.
  • Empirical research that merely used self-report data collection, such as interviews or surveys.
  • Research on building robots in programming courses.
  • Studies of faculty and student perceptions of robots.
Coding scheme (component, dimension, coding items, and reference source):
  • CK (Learning domain): languages; engineering or computers; science; health, medical, or nursing; social science or social studies; business and management; arts or design; mathematics (Hwang and Chang)
  • PK (Teaching strategy): practice on specific learning material, physical interaction, communication, role play, and collaborative language learning (Engwall and Lopes)
  • TK (Robot types): toy-like robots, face or belly screen robots, humanoid robots, robotic heads, and programmable robots (Engwall and Lopes)
  • TCK (Learning results): cognitive, behavioral, and affective (Albarracin and Hepler)
  • TPK (Problems with using robots): analyzed from three perspectives: teacher, student, and robot (Huang and Hew)
  • PCK (Robotic support): information support and emotional support (Leite and Castellano)
  • TPCK (Robotic personality): openness, conscientiousness, extroversion, agreeableness, and neuroticism (Diener and Lucas)

Wang, H.; Luo, N.; Zhou, T.; Yang, S. Physical Robots in Education: A Systematic Review Based on the Technological Pedagogical Content Knowledge Framework. Sustainability 2024, 16, 4987. https://doi.org/10.3390/su16124987

Stereo digital image correlation (3D-DIC) for non-contact measurement and refinement of delta robot arm displacement and jerk

  • ORIGINAL ARTICLE
  • Published: 11 June 2024


  • Shih-Hao Lin 1 ,
  • En-Tze Chen 1 ,
  • Jia-Jun Xiao 1 &
  • Ching-Yuan Chang   ORCID: orcid.org/0000-0003-4157-5292 1  

This study proposes using the stereo digital image correlation (3D-DIC) method to measure the displacement and jerk of a delta robot arm, in order to verify the end-effector position and improve absolute accuracy. The cameras are arranged eye-to-hand relative to the robotic arm. The study's contribution is using cameras as sensors for non-contact, full-field measurement; a non-contact sensor avoids altering the object's motion state through contact. The two low-cost cameras used in this study cost about 500 USD. The intrinsic and extrinsic parameters of the two cameras are obtained by calibrating against the pinhole camera model using a chessboard pattern. A spherical feature is mounted on the delta robot's end-effector, and the Hough circle transform together with 3D-DIC is used to trace the sphere's center. The end-effector displacement measured by 3D-DIC is compared against a laser displacement meter, and the acceleration and jerk against an accelerometer. The study verifies that the displacement error is less than 2%, and that the peak acceleration and jerk values are close to those measured by the accelerometer. Applying this technique to 3D printing can reduce defects caused by excessive speed.
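The acceleration and jerk compared against the accelerometer are the second and third time derivatives of the measured displacement trace. A minimal sketch of recovering them from a uniformly sampled displacement signal by repeated finite differencing (illustrative only, not the authors' code; the function name and the use of NumPy are my own choices):

```python
import numpy as np

def displacement_derivatives(x, dt):
    """Estimate velocity, acceleration, and jerk from a uniformly
    sampled displacement trace x with sample period dt, using
    second-order finite differences (np.gradient)."""
    v = np.gradient(x, dt)  # 1st derivative: velocity
    a = np.gradient(v, dt)  # 2nd derivative: acceleration
    j = np.gradient(a, dt)  # 3rd derivative: jerk
    return v, a, j

# Synthetic check: for x(t) = t**3 the jerk is the constant 6.
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
v, a, j = displacement_derivatives(t**3, dt)
```

In practice each numerical differentiation amplifies measurement noise, so a real DIC displacement trace would normally be smoothed or low-pass filtered before the jerk is computed.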


Data availability

Shared on the link until 12/31/2024: https://1drv.ms/f/s!AqGlWWzEEC3dkDUDuA88BgEfLozJ?e=7bQAay .


The authors received financial support for this research from the Ministry of Science and Technology (Republic of China) under Grant MOST 111-2221-E-027-124-MY2.

Author information

Authors and Affiliations

Department of Mechanical Engineering, National Taipei University of Technology, Taipei, 10617, Taiwan, ROC

Shih-Hao Lin, En-Tze Chen, Jia-Jun Xiao & Ching-Yuan Chang


Contributions

Ching-Yuan Chang contributed the mathematical model. En-Tze Chen contributed the experimental setup. Jia-Jun Xiao contributed to data collection. Shih-Hao Lin contributed to data collection and data analysis.

Corresponding author

Correspondence to Ching-Yuan Chang .

Ethics declarations

Consent for publication.

The grant’s policy encourages researchers to broaden industrial automation applications and to publish the latest research in a reputable journal.

Conflict of interest

The authors declare no competing interests.



About this article

Lin, SH., Chen, ET., Xiao, JJ. et al. Stereo digital image correlation (3D-DIC) for non-contact measurement and refinement of delta robot arm displacement and jerk. Int J Adv Manuf Technol (2024). https://doi.org/10.1007/s00170-024-13957-2

Received : 05 March 2024

Accepted : 29 May 2024

Published : 11 June 2024

DOI : https://doi.org/10.1007/s00170-024-13957-2

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Stereo digital image correlation (3D-DIC)
  • End-effector position
  • Eye-to-hand
  • Non-contact measurement
  • Hough circle transform
  • Displacement error

COMMENTS

  1. (PDF) DEVELOPMENT OF A ROBOT ARM: A REVIEW

    A manipulator, or robot arm (according to Park and Lynch (2016)), has an arm that ensures mobility, a wrist that confers dexterity, and an end-effector that performs the task required of the ...

  2. Current Designs of Robotic Arm Grippers: A Comprehensive Systematic Review

    The ability to grip and manipulate objects has been central to the advancement of robots [1,2,3,4,5,6,7,8,9,10].Manufacturers can use end-effector tooling for picking, placing, and packing objects using advances in gripper technology to reap the benefits of precision, performance, and productivity [].Grippers are classified depending on their design, how they are powered, and their application.

  3. PDF Review on Design and Development of Robotic Arm Generation-1"

    LITERATURE REVIEW. A survey on Arduino Controlled Robotic Arm by Ankur Bhargava: in this paper a 5 Degree of Freedom (DOF) robotic arm has been developed. ... robotic arm kinematics with hardware design, electrical design and implementation, Journal of Robotics, 2010, Volume 10. [2]. Rahman A, Khan A. H, Dr. Ahmed T, Md Sajjad M,

  4. Robotic arms in precision agriculture: A comprehensive review of the

    Robotic arms used in paddy fields, as part of agricultural machinery, execute tasks such as weeding, and feature designs that enable flexible movement in muddy terrains. Sori et al. (2018) introduced a weeding robot designed to navigate through paddy fields and traverse across rice seedlings (Fig. 16 (c)). This robot employed a similar ...

  5. Design and analysis of a robotic arm under different loading conditions

    A robotic arm is a manipulator that usually works similarly to the human arm. The robotic arm's links are connected at joints where rotational motion is provided; this connection is considered to form a kinematic chain. The end of the kinematic chain is called an end-effector, and it is analogous to a human hand.

  6. An Automated Robotic Arm: A Machine Learning Approach

    utilized to design and develop the robotic arm as displayed in Fig. 2 [14]. 1) Robotic Arm: A robotic arm (Fig. 2) is a type of arm, usually mechanical, that can be programmed with capabilities almost equivalent to an actual arm. The robotic arm can be an aggregate of components or a part of a more advanced robot.

  7. Recent advancements of robotics in construction

    A mixed-method review of the literature on robotics in construction since the 21st century. ... Industrial robotic arms are highly flexible and can be configured for complex construction tasks. In this approach, the motion plan is designed to reduce positioning errors caused by frequent robot base movement and to avoid collisions.

  8. SIX DEGREE ROBOTIC ARM WITH MIMICKING MECHANISM

    implementation of a 6-DOF robotic arm, which should perform industrial tasks such as pick-and-place operations on fragile objects. This robot arm being ... LITERATURE SURVEY 3.1 Literature Review. 3.1.1 The Concept of Degrees of Freedom 3.1.2 Design and Development of 6-DOF Robotic Arm Controlled by Man Machine Interface. ...

  9. Biomimetic Approaches for Human Arm Motion Generation: Literature

    Based on our literature review, the overall research comprised approx. 51% robotic manipulators, 30% humanoid robots, 17% musculoskeletal robots, and 2% hyper-redundant robots. It was evident from the literature that humanoids and seven degree of freedom manipulators served as the ideal platforms for testing biomimetic algorithms due to their ...

  10. A Review of Current Techniques for Robotic Arm Manipulation and Mobile

    Robotic manipulators are widely used in the field of robotics. In this paper, four different manipulators are reviewed, and their use in robotic applications is presented. These robotic implementations and applications continue to evolve to meet an ever-growing industrial need for improved efficiency and functionality. This is a work in progress and the next step of this research is to use the ...

  11. A Review of Motion Planning Algorithms for Robotic Arm Systems

    This article mainly conducts literature review and research on some aspects of motion planning of robotic arm within the most recent 5 years. We searched for the relevant keywords from the four main academic searching engines such as Web of Science, IEEE, Scopus and Google Scholar.

  12. Evolution of robotic arms

    This is a thorough review of the literature on the nature and development of this device with emphasis on surgical applications. We have reviewed the published literature and classified robotic arms by their application: show, industrial application, medical application, etc. There is a definite trend in the manufacture of robotic arms toward ...

  13. Human-Like Arm Motion Generation: A Review

    In the last decade, the objectives outlined by the needs of personal robotics have led to the rise of new biologically-inspired techniques for arm motion planning. This paper presents a literature review of the most recent research on the generation of human-like arm movements in humanoid and manipulation robotic systems. Search methods and inclusion criteria are described. The studies are ...

  14. PDF A Review of Motion Planning Algorithms for Robotic Arm Systems

    When multiple robot arms need to complete a job at the same time, mutual obstacle avoidance between the robot arms also needs to be considered. In order to ensure that the robot ... literature review and research on some aspects of motion planning of robotic arms within the most recent 5 years. We searched for the relevant keywords from the four

  15. A Review on Cooperative Robotic Arms with Mobile or Drones Bases

    This review paper focuses on cooperative robotic arms with mobile or drone bases performing cooperative tasks. This is because cooperative robots are often used as risk-reduction tools to human life. For example, they are used to explore dangerous places such as minefields and disarm explosives. Drones can be used to perform tasks such as aerial photography, military and defense missions ...

  16. A Review of Spatial Robotic Arm Trajectory Planning

    The literature summarized the history of successful robot manipulators such as the Shuttle Remote Manipulator System (SRMS) and Space Station Remote Manipulator System (SSRMS). It includes the most famous Canadian arm, Canadarm2, the Japanese Experiment Module (JEM) Remote Manipulator System, and the European Robotic Arm (ERA). ... "A Review of ...

  17. PDF A Review on Design and Fabrication of Robotic Arm

    degrees-of-freedom robot arm mechanism based on the type of end-effector attached to the robot arm. 3.5. Review on design and development of an intelligent robotic arm, by Netra Barai and Swati Manekar, October 2015 [5]: Industrial automation requires a large number of machines to perform the same set of actions repeatedly.

  18. The AI revolution is coming to robots: how will it change them?

    The term robot covers a wide range of automated devices, from the robotic arms widely used in manufacturing, to self-driving cars and drones used in warfare and rescue missions. Most incorporate ...

  19. Evolution of robotic arms

    The foundation of surgical robotics is in the development of the robotic arm. This is a thorough review of the literature on the nature and development of this device with emphasis on surgical applications. We have reviewed the published literature and classified robotic arms by their application: show, industrial application, medical application, etc. There is a definite trend in the ...

  20. PDF Pick and Place Robotic Arm: A Review Paper

    Fig 4: Configuration of Mbed. 2. Literature Review. 2.1 Development of Robotic Arm Using Arduino UNO by Priyambada Mishra, Riki Patel, Trushit Upadhyaya, Arpan Desai: in this paper, they used 4 servo motors to make the joints of the robotic arm, and the movement is controlled with the help of a potentiometer.

  21. Physical Robots in Education: A Systematic Review Based on the ...

    Driven by the wave of artificial intelligence, the educational practice and application of robots have become increasingly common. Despite extensive coverage in the literature on various aspects of educational robots, there are still unexplored avenues, particularly regarding robotic support, robotic personality, and challenges in their applications. This study presented a systematic review of ...

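The forward-kinematics relations quoted in the review abstract at the top of this page (Equations 1 to 3 for a two-link planar arm) translate directly into code. A minimal illustrative sketch (the function name is my own, not taken from any of the cited papers):

```python
import math

def two_link_fk(l1, l2, theta1, theta2):
    """Forward kinematics of a two-link planar arm (Eqs. 1-3):
    x = l1*cos(t1) + l2*cos(t1 + t2)
    y = l1*sin(t1) + l2*sin(t1 + t2)
    theta = t1 + t2 (end-effector orientation)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y, theta1 + theta2

# Both joints at zero: the arm lies fully extended along the x-axis.
x, y, theta = two_link_fk(1.0, 1.0, 0.0, 0.0)  # -> (2.0, 0.0, 0.0)
```

Inverse kinematics, mentioned alongside these equations, runs the same geometry in reverse: given a desired end-effector position (x, y), it solves for the joint angles theta1 and theta2.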