Robotic Telescope


Robotic telescopes are complex systems that typically incorporate a number of subsystems. These subsystems provide capabilities such as telescope pointing, operation of the detector (generally a CCD camera), control of the dome or telescope enclosure, control of the telescope's focuser, detection of weather conditions, and other functions. These varying subsystems are typically presided over by a master control system, which is almost always a software component.

Robotic telescopes operate under closed loop or open loop principles. In an open loop system, a robotic telescope points itself and collects its data without inspecting the results of its operations to ensure it is operating properly. An open loop telescope is often said to be operating on faith, in that if anything goes wrong, there is no way for the control system to detect it and compensate.

A closed loop system has the capability to evaluate its operations through redundant inputs to detect errors. A typical such input would be position encoders on the telescope's axes of motion, or the capability of evaluating the system's images to ensure it was pointed at the correct field of view when they were exposed.
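The difference between the two principles can be sketched in a few lines of code. This is a minimal illustration, not any real telescope's control software: the `SimulatedMount` class and its 2% drive undershoot are invented for the example, and the encoder stands in for the redundant input described above.

```python
def closed_loop_slew(mount, target_deg, tolerance_deg=0.01, max_iter=5):
    """Slew, then verify against the axis encoder and correct any residual
    pointing error -- the defining behavior of a closed loop system."""
    mount.slew_to(target_deg)                        # the open-loop command
    for _ in range(max_iter):
        error = target_deg - mount.read_encoder()    # check the redundant input
        if abs(error) <= tolerance_deg:
            return True                              # pointing confirmed
        mount.slew_to(mount.read_encoder() + error)  # corrective move
    return False                                     # flag for attention


class SimulatedMount:
    """Toy mount whose drive systematically undershoots by 2%, so an
    open-loop slew alone never quite reaches the commanded position."""
    def __init__(self):
        self.position = 0.0

    def slew_to(self, target_deg):
        self.position += 0.98 * (target_deg - self.position)

    def read_encoder(self):
        return self.position
```

An open loop system would stop after the first `slew_to` call and trust the result; the closed loop version converges to the target despite the systematic drive error.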

Most robotic telescopes are small telescopes. Although large observatory instruments may be highly automated, few are operated without attendants.

History of professional robotic telescopes

Robotic telescopes were first developed by astronomers after electromechanical interfaces to computers became widespread at observatories. Early examples were expensive, had limited capabilities, and included a large number of unique subsystems, in both hardware and software. This contributed to a lack of progress in the development of robotic telescopes early in their history.

By the early 1980s, with the availability of cheap computers, several viable robotic telescope projects were conceived, and a few were developed. The 1985 book, Microcomputer Control of Telescopes, by Mark Trueblood and Russell M. Genet, was a landmark engineering study in the field. One of this book's achievements was pointing out many reasons, some quite subtle, why telescopes could not be reliably pointed using only basic astronomical calculations. The ideas explored in this book share a common heritage with the telescope mount error modeling software known as Tpoint, which emerged from the first generation of large automated telescopes in the 1970s, notably the 3.9 m Anglo-Australian Telescope.

Since the late 1980s, the University of Iowa has been at the forefront of robotic telescope development on the professional side. The Automated Telescope Facility (ATF), developed in the early 1990s, was located on the roof of the physics building at the University of Iowa in Iowa City. They went on to complete the Iowa Robotic Observatory, a robotic and remote telescope at the private Winer Observatory, in 1997. This system successfully observed variable stars and contributed observations to dozens of scientific papers. In May 2002, they completed the Rigel Telescope. The Rigel was a 0.37-meter (14.5-inch) F/14 built by Optical Mechanics, Inc. and controlled by the Talon program. Each of these was a progression toward a more automated and utilitarian observatory.

One of the largest current networks of robotic telescopes is RoboNet, operated by a consortium of UK universities. The Lincoln Near-Earth Asteroid Research (LINEAR) Project is another example of a professional robotic telescope. LINEAR's competitors, the Lowell Observatory Near-Earth-Object Search, Catalina Sky Survey, Spacewatch, and others, have also developed varying levels of automation.

In 2002, the Rapid Telescopes for Optical Response (RAPTOR) project pushed the envelope of automated robotic astronomy by becoming the first fully autonomous closed-loop robotic telescope. RAPTOR was designed in 2000 and began full deployment in 2002. Its first light on one of the wide field instruments was in late 2001, with the second wide field system coming online in early 2002. Closed loop operations began in 2002. Initially the goal of RAPTOR was to develop a system of ground-based telescopes that would reliably respond to satellite triggers and, more importantly, identify transients in real time and generate alerts with source locations to enable follow-up observations with other, larger, telescopes. It has achieved both of these goals quite successfully. Now RAPTOR has been re-tuned to be the key hardware element of the Thinking Telescopes Technologies Project. Its new mandate is the monitoring of the night sky looking for interesting and anomalous behaviors in persistent sources using some of the most sophisticated robotic software ever deployed. The two wide field systems are a mosaic of CCD cameras. The mosaic covers an area of around 1500 square degrees to a depth of 12th magnitude. Centered in each wide field array is a single fovea system with a field of view of 4 degrees and a depth of 16th magnitude. The wide field systems are separated by a 38 km baseline. Supporting these wide field systems are two other operational telescopes. The first of these is a cataloging patrol instrument with a mosaic 16 square degree field of view down to 16th magnitude. The other system is a 0.4 m OTA yielding a depth of 19th-20th magnitude and a coverage of 0.35 degrees. Three further systems are currently undergoing development and testing, and deployment will be staged over the next two years. All of the systems are mounted on custom-manufactured, fast-slewing mounts capable of reaching any point in the sky in 3 seconds.
The RAPTOR system is located on site at Los Alamos National Laboratory (USA) and has been supported through the Laboratory's Directed Research and Development funds.

In 2004, some professional robotic telescopes were characterized by a lack of design creativity and a reliance on closed source and proprietary software. The software is usually unique to the telescope it was designed for and cannot be used on any other system. Frequently, robotic telescope software developed at universities becomes impossible to maintain and eventually obsolete because the graduate students who wrote it move on to new positions, and their institutions lose their expertise. Large telescope consortia or government-funded laboratories do not tend to experience the same loss of developers as universities. Professional systems generally feature very high observing efficiency and reliability. There is also an increasing tendency to adopt ASCOM technology at a few professional facilities (see following section). The need for proprietary software is usually driven by the competition for research dollars among institutions.

History of amateur robotic telescopes

As of 2004, most robotic telescopes were in the hands of amateur astronomers. A prerequisite for the explosion of amateur robotic telescopes was the availability of relatively inexpensive CCD cameras, which appeared on the commercial market in the early 1990s. These cameras not only allowed amateur astronomers to make pleasing images of the night sky, but also encouraged more sophisticated amateurs to pursue research projects in cooperation with professional astronomers. The main motive behind the development of amateur robotic telescopes has been the tedium of making research-oriented astronomical observations, such as taking endlessly repetitive images of a variable star.

In 1998, Bob Denny conceived of a software interface standard for astronomical equipment, based on Microsoft's Component Object Model, which he named the Astronomy Common Object Model (ASCOM). He also wrote and published the first examples of this standard, in the form of commercial telescope control and image analysis applications, and several freeware components. He also convinced Doug George to incorporate ASCOM capability into a commercial camera control software program. Through this technology, a master control system that integrated these applications could easily be written in Perl, VBScript, or JavaScript. A sample script of that nature was provided by Denny.
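The master-control pattern described above can be sketched as follows. This is a hedged illustration, not Denny's actual sample script: on Windows the drivers would be obtained through COM late binding, whereas here stub classes stand in for real drivers so the control logic itself is visible. The member names (`Connected`, `SlewToCoordinates`, `StartExposure`, `ImageReady`) follow the published ASCOM interfaces; the target list and exposure time are invented.

```python
class StubTelescope:
    """Stand-in for an ASCOM ITelescope driver."""
    def __init__(self):
        self.Connected = False
        self.RightAscension = 0.0
        self.Declination = 0.0

    def SlewToCoordinates(self, ra_hours, dec_degrees):
        self.RightAscension, self.Declination = ra_hours, dec_degrees


class StubCamera:
    """Stand-in for an ASCOM ICamera driver."""
    def __init__(self):
        self.Connected = False
        self.ImageReady = False

    def StartExposure(self, duration_s, light=True):
        self.ImageReady = True  # a real camera sets this asynchronously


def observe(telescope, camera, targets, exposure_s=60.0):
    """Point at each target and take one exposure -- the core loop of a
    minimal master control script driving ASCOM-style drivers."""
    telescope.Connected = True
    camera.Connected = True
    frames = 0
    for ra, dec in targets:
        telescope.SlewToCoordinates(ra, dec)
        camera.StartExposure(exposure_s, True)
        if camera.ImageReady:
            frames += 1
    return frames
```

The key point of the ASCOM design is that `observe` never needs to know which vendor's hardware is attached; any driver exposing the standard interface members can be substituted for the stubs.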

Following coverage of ASCOM in Sky &amp; Telescope magazine several months later, ASCOM architects such as Bob Denny, Doug George, Tim Long, and others later influenced ASCOM into becoming a set of codified interface standards for freeware device drivers for telescopes, CCD cameras, telescope focusers, and astronomical observatory domes. As a result, amateur robotic telescopes have become increasingly more sophisticated and reliable, while software costs have plunged. ASCOM has also been adopted for some professional robotic telescopes.

Meanwhile, ASCOM users developed ever more capable master control systems. Papers presented at the Minor Planet Amateur-Professional Workshops (MPAPW) in 1999, 2000, and 2001 and the International Amateur-Professional Photoelectric Photometry Conferences of 1998, 1999, 2000, 2001, 2002, and 2003 documented increasingly sophisticated master control systems. Some of the capabilities of these systems included automatic selection of observing targets, the ability to interrupt observing or rearrange observing schedules for targets of opportunity, automatic selection of guide stars, and sophisticated error detection and correction algorithms.

Remote telescope system development began in 1999, with first test runs on real telescope hardware in early 2000. RTS2 was primarily intended for gamma-ray burst follow-up observations, so the ability to interrupt an observation was a core component of its design. During development, it became an integrated observatory management suite. Other additions included use of the PostgreSQL database for storing targets and observation logs, the ability to perform image processing including astrometry, performance of real-time telescope corrections, and a web-based user interface. RTS2 was from the beginning designed as a completely open source system, without any proprietary components. In order to support a growing list of mounts, sensors, CCDs, and roof systems, it uses its own text-based communication protocol. The RTS2 system is described in papers appearing in 2004 and 2006.

The Instrument Neutral Distributed Interface (INDI) was started in 2003. In comparison to the Microsoft Windows-centric ASCOM standard, INDI is a platform-independent protocol developed by Elwood C. Downey of ClearSky Institute to support control, automation, data acquisition, and exchange among hardware devices and software front ends.
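Unlike ASCOM's COM object model, INDI is a plain-XML wire protocol, which is what makes it platform independent. The sketch below builds one such client message; the element and property names (`newNumberVector`, `oneNumber`, `EQUATORIAL_EOD_COORD`, `RA`, `DEC`) follow the INDI standard, while the device name and coordinates are hypothetical examples.

```python
import xml.etree.ElementTree as ET


def new_number_vector(device, name, values):
    """Build the XML message an INDI client sends to set numeric
    properties on a device (e.g. commanding a telescope slew)."""
    vector = ET.Element("newNumberVector", device=device, name=name)
    for prop, value in values.items():
        one = ET.SubElement(vector, "oneNumber", name=prop)
        one.text = f"{value:.6f}"
    return ET.tostring(vector, encoding="unicode")


# Command a hypothetical "Telescope Simulator" device to slew:
msg = new_number_vector("Telescope Simulator",
                        "EQUATORIAL_EOD_COORD",
                        {"RA": 5.5903, "DEC": -5.3911})
```

Because the payload is ordinary XML over a socket, an INDI client can be written in any language with an XML library, on any operating system.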


By 2004, robotic observations accounted for an overwhelming percentage of the published scientific information on asteroid orbits and discoveries, variable star research, supernova light curves and discoveries, comet orbits, and gravitational microlensing observations.

All early phase gamma-ray burst observations were carried out by robotic telescopes.