
Drone / UAV


Terra Drone Indonesia teams with Japan’s leading power industry research institute to monitor transmission lines for PLN

Crew members in Indonesia with Terra Wing drone

Terra Drone Indonesia and the Central Research Institute of Electric Power Industry (Headquarters: Chiyoda-ku, Tokyo; President: Masanori Matsuura; hereinafter called CRIEPI) have successfully demonstrated how unmanned aerial vehicles (UAVs) can be used to monitor electricity transmission lines in Indonesia. The demonstration was conducted using the Terra Wing drone at PT PLN (Persero)'s main transmission unit (UIT) for East Java and Bali (JBTB). In only 10 minutes, the drone monitored 2 km of electricity lines spanning four transmission towers.

As Indonesia’s state electricity corporation, PT PLN (Persero) has an obligation to ensure optimum power distribution in the country. To support this endeavor, several steps are taken, which are not only related to adequate power generation but also to its efficient transmission and distribution. Maintenance of transmission lines is one of those steps, and an important one to guarantee the availability of electricity throughout Indonesia.

The results obtained through drone monitoring are in the form of a point cloud or 3D model of the power line, derived using a software system developed jointly by Terra Drone and CRIEPI. This data can be analyzed to understand the distance between cables and to identify parts that have started to become loose. The data also acts as an efficient vegetation management solution because potentially disruptive foliage surrounding the power lines can be easily identified in great detail and with high accuracy using drones.

Power line inspection report (example)

Terra Wing is a fixed-wing drone developed especially for topographic survey purposes in sectors like infrastructure, mining, agriculture, and utilities. In Southeast Asia, Terra Drone Indonesia is the first Drone Service Provider (DSP) to have used this drone technology.

PLN is buoyant about using drones because the technology has proven to reduce risk to human workers in high-voltage areas. Moreover, using drones is economical and results in high-quality images. Through this demonstration, Terra Drone Indonesia has proven that drone technology can be used to safely carry out mapping and monitoring in high-risk areas such as energy and utilities, and will continue to do so in the future.

With more than 20 group companies already in its network, Terra Drone is always on the lookout to collaborate with cutting-edge drone technology companies. If you want to support Terra Drone on its quest to generate aerial innovations that have a great impact on our society, please contact [email protected].


About Terra Drone Corporation
Terra Drone Corporation, established in 2016, is the world’s leading provider of industrial drone solutions. The company’s head office is located in Tokyo, Japan, with 20 branches globally throughout APAC, the European Union, and South America. Terra Drone provides innovative drone technologies empowered with LiDAR and photogrammetric surveying methods for the construction, electricity, energy, and oil and gas sectors. Terra Drone specializes in high-performance hardware, cutting-edge software, drone services, and unmanned traffic management (UTM) systems.

Terra Drone conducted its maiden global summit attended by the top executives of Terra Drone group companies from over 20 countries in early March.

SimActive Software Used with eBee X for Shoreline Mapping


Montreal, Canada, April 9th, 2019 – SimActive Inc., a world-leading developer of photogrammetry software, announces that Correlator3D™ is being used by Altimedias for mapping projects in Brittany, France. An eBee X equipped with a senseFly S.O.D.A. 3D camera is flown along the shoreline to produce high-resolution true orthomosaics and 3D models.

“The quality of outputs from Correlator3D™ is exceptional and the mosaic renders the vivid colours of the Pink Granite Coast”, said Didier Wasselin, COO at Altimedias. “Such results are very useful for heritage conservation and decision making by local authorities.”

“The combination of SimActive software and the senseFly eBee X is ideal, due to the accurate RTK/PPK and optimized aerial triangulation”, said Francois Gervaix, Technical Advisor at SimActive. “The S.O.D.A. oblique imagery leads to outstanding 3D textured models.”

About Altimedias

Based in France, Altimedias is a leading firm specialized in drone data collection and processing. Services offered include building facade and roof inspection, topographic surveys, photogrammetric production and drone training. For more information, visit www.altimedias.fr.

About SimActive

SimActive is the developer of Correlator3D™ software, a patented end-to-end photogrammetry solution for the generation of high-quality geospatial data from satellite and aerial imagery, including UAVs. Correlator3D™ performs aerial triangulation (AT) and produces dense digital surface models (DSM), digital terrain models (DTM), point clouds, orthomosaics, 3D models and vectorized 3D features. Powered by GPU technology and multi-core CPUs, Correlator3D™ ensures matchless processing speed to support rapid production of large datasets. SimActive has been selling Correlator3D™ to leading mapping firms and government organizations around the world, offering cutting-edge photogrammetry software backed by exceptional customer support. For more information, visit www.simactive.com.

How Drones Are Changing the Shipping Industry


Most people are familiar with the drones used to take aerial photographs or to fly around their backyards as hobbyists. Industries including the military also rely on drones for their operations. Here, we’ll look at the current and future impact of drones on the shipping industry.

Drones Making Deliveries to Offshore Ships

Companies are always looking for ways to streamline their operations. For many of them, that means seeing how drones might help. Airbus chose Singapore as the testing ground for a pilot program that uses drones to deliver goods to ships anchored offshore. The parcels have a maximum weight of approximately 3 pounds.

Efficiency is one of the notable advantages of using drones like this. The drones used for this trial complete a journey of almost a mile in approximately 10 minutes. According to Airbus, the drones could handle slightly longer routes and heavier cargo. As such, these vehicles could give shipping companies another option when transporting goods to offshore vessels.

Using Drones to Move Medical Samples

Drones could also be advantageous when dealing with delicate, time-sensitive samples. At least, that’s the hope of a hospital system in North Carolina that’s using drones to carry patient specimens. For now, most of them get transported in courier cars, but the organization believes drones could be beneficial for avoiding issues such as road traffic backups.

The drones have rechargeable batteries and can travel more than 12 miles while holding samples weighing up to 5 pounds. The hospital carrying out this trial has multiple facilities, and drones may assist in getting samples to the pathology lab quicker than usual, especially when they come from relatively distant places.

In another instance that combines medical deliveries and drones, people in Ghana will receive medical supplies — including blood for transfusions — via drones. Supporters of this initiative say it could be exceptionally useful for getting things to remote areas substantially faster than road-based vehicles could.

A Push Towards Sustainability in Storage and Transport

Companies that ship things to their customers increasingly look for ways to promote sustainability. They know doing so could save them money and please customers at the same time. One example is bulk bags, which store hundreds or thousands of pounds of dry products, keeping them safe during storage or transport.

Consider that roughly 42 million bulk bags are used each year, with usage growing at an average annual rate of approximately 15 percent. Bulk bags are strong enough for reuse, letting companies cut down on wasted resources. Similarly, there are reusable shipping envelopes and boxes for smaller quantities of goods. When companies choose those, they can move towards more sustainable practices.

So, how do drones assist with increased sustainability? They can’t hold the large quantities that bulk bags do, but like bulk bags, they fit into the broader emphasis on sustainability in shipping.

According to a study, drones use less energy than trucks in some cases. More specifically, researchers found that in states like California, which have access to sustainable energy sources for powering drones, a drone trip has a smaller carbon footprint than it would in states more dependent on fossil fuels.

Plus, it’s important to keep in mind that people are hard at work creating cargo drones to carry significantly larger loads — maybe even bulk bags. For example, one startup is testing prototype drones that are 30 feet long, can carry 700 pounds and could travel as far as 2,500 nautical miles. These trials are just “taxi tests” on bodies of water for now, but the research team will get the drones ready for their first flights in an upcoming phase.

A Federal Grant to Explore Drone Delivery Possibilities

Analysts predict that the worth of the drone logistics and transportation industry will surpass $29 billion by 2027. And, it’s arguable that an increase in dedicated research could help scientists overcome obstacles and make lasting progress.

That’s why it’s good news that a research team from Carnegie Mellon University received a federal grant worth $2.5 million and will use some of the funds to examine the energy implications of autonomous technologies, including drones — specifically for the first- and last-mile segments of deliveries.

Those are the most costly and energy-intensive portions. The researchers will also get input from companies in Pittsburgh, including Amazon, during their project.

It’s too early to say now what kind of impact this academic study might have. But, if successful, it could encourage more enterprises to consider drones for some of their shipping needs.

An Exciting Future in the Shipping Industry

These examples show how drones could drastically change how things get shipped. If that happens, items could reach their destination in a way that saves energy.

Author Bio: Emily is a green tech writer who covers topics in renewable energy and sustainable design. You can read more of her work on her blog, Conservation Folks.

Hangar Names Matt Hunter as Chief Product Officer


AUSTIN, Texas–(BUSINESS WIRE)–Hangar Technology, Inc., a leading drone analytics company, today announced that industry veteran Matt Hunter is joining Hangar as Chief Product Officer to build on recent momentum and take Hangar software to the next level.

“Hangar is changing the way companies understand and utilize physical infrastructure, by unlocking meaningful intelligence at enterprise scale,” said Scott Lumish, CEO of Hangar. “Matt brings a unique mix of industry and product experience, and has a proven track record of cultivating product and engineering teams. As we expand our platform and put drones to increasingly valuable work, we look forward to the leadership he’ll add.”

Matt brings nearly two decades of experience leading the product teams of ambitious start-ups as they navigate periods of growth. He has worked in a diverse array of innovative industries, including mobile analytics, social video, and eCommerce. At App Annie, Matt helped scale the organization to over 500 employees and expand to 12 offices globally, establishing the firm as the leader in app market intelligence. Most recently, Matt ran product management, product marketing and design for San Francisco-based DroneDeploy. Collectively Matt has raised over $175M in funding from top VCs.

“We’re entering a new era of business intelligence where drones are making it possible to scan, measure and count every element of the physical environment,” says Matt Hunter, CPO. “That data creates insights which help businesses lower costs, increase productivity and ultimately drive faster growth. It’s a transformative technology, and Hangar is leading the way. I’m looking forward to joining Scott’s team and building on the great work this team has already done.”

About Hangar

Hangar is an integrated drone analytics platform for automating the collection and flow of drone data, as information is passed through hardware, software and service systems to become insight. Using proprietary capture technology that enables off-the-shelf drones to autonomously execute highly specialized missions, Hangar feeds valuable insights into business workflows at a volume otherwise impossible.

Hangar is transforming how industries understand locations, unlocking new business workflows that reimagine how companies monitor investments, perform maintenance and inventory assets – helping teams work more safely, efficiently and intelligently.

For information, visit http://www.Hangar.com.


PODIUM announces U-space visitor events


Brussels, Belgium – April 2, 2019 – Over the coming months, the SESAR Horizon 2020 PODIUM project will perform demonstrations of U-space services and technologies at five locations.

Those locations are the Drones Paris Region cluster (Bretigny, France), the Netherlands RPAS Test Centre (Marknesse, The Netherlands), Groningen Airport (Eelde, The Netherlands), Hans Christian Andersen Airport (Odense, Denmark) and Rodez-Aveyron Airport (France). The demonstrations will take place until the end of June 2019.

PODIUM partners will perform BVLOS (beyond visual line of sight) and VLOS (visual line of sight) flights covering eighteen operational scenarios, including multiple drone flights and flights in the vicinity of airports.

The project aims to collect and analyse feedback from drone operators, air traffic controllers, supervisors and authorities with a view to validating the ease-of-use and benefits of U-space. PODIUM seeks a strong engagement from stakeholders in order to strengthen its conclusions on maturity and recommendations for improvements.

For this reason, PODIUM is convening the following U-space visitor events:

  • May 14: “Unexpected” scenarios with security (joint event with SECOPS) 
    Hosted by NLR: Netherlands RPAS Test Centre – Marknesse, The Netherlands.
  • May 23: Regular drone usage and interoperability with manned aviation 
    Hosted by Integra Aerial Services and Naviair: Hans Christian Andersen Airport – Odense, Denmark.
  • June 4: “Unexpected” scenarios in an ATC controlled airport environment 
    Hosted by NLR: Groningen Airport – Eelde, The Netherlands.
  • June 13: Enhanced Business Operations
    Hosted by Drone Paris Region cluster – Bretigny, France.
  • June 26: Interoperability with ATC
    Hosted by Airbus: Rodez-Aveyron Airport, France.
  • October 17: Dissemination event
    EUROCONTROL headquarters, Brussels.

The respective site hosts will issue invitations to local stakeholders shortly. The PODIUM project coordinator, EUROCONTROL, will issue invitations and registration details to a wider set of stakeholders for the final dissemination event.

The latest information on PODIUM and the visitor events will be maintained on the website and the U-space PODIUM LinkedIn page.


PODIUM stands for Proving Operations of Drones with Initial Unmanned aircraft system traffic Management. The main PODIUM partners are EUROCONTROL (project coordinator), Airbus, Delair, Drone Paris Region, DSNA, Integra Aerial Services, Naviair, NLR, Orange and Unifly.

PODIUM supports U-space, the European vision for the safe, secure and efficient handling of drone traffic, and a key enabler for a growing drone market to generate economic and societal benefits.

Drone Registration Figures Confirm senseFly eBee as United States’ Most Popular Commercial Fixed-Wing


Raleigh, NC – April 3, 2019 – senseFly, the leading provider of fixed-wing drone solutions, has confirmed its position as the United States’ most popular fixed-wing drone provider, according to data provided by the FAA.

According to official FAA Part 107 commercial drone registration data, received by senseFly following a Freedom of Information Act (FOIA) request to the FAA, senseFly small Unmanned Aerial Systems (sUAS, or drones) account for almost half of all new commercial fixed-wing drone registrations, at 45%. The FAA’s figures, spanning the period January to September 2018 (the latest data available), show that last year’s registrations of the senseFly eBee are a full 29 percentage points ahead of the next fixed-wing provider on the list.

“These figures confirm senseFly’s professional eBee drone as the universal fixed-wing of choice, trusted by professionals to be their go-to aerial mapping solution,” commented Gilles Labossière, CEO of senseFly. “What we see is that industrial users are increasingly understanding the efficiency benefits a fixed-wing solution can bring over other sUAS platforms, such as quadcopters; primarily, that a fixed-wing drone’s greater endurance enables operators to complete medium- and larger-sized projects faster than a rotary drone. This helps to reduce in-field labour costs at a project level, and it enables operators to complete, and bill, more projects each week.” 


These US market share figures are reflected across the Atlantic in France, the only other country for which commercial drone figures are publicly available. According to the most recent listing of active commercial UAV operators published by France’s Direction générale de l’Aviation civile (DGAC), senseFly professional drones account for over half of all commercial fixed-wing drone use in France, at 53%, a full 42 percentage points ahead of the next fixed-wing provider on the list.

To locate a senseFly distribution partner near you, visit senseFly’s Where to Buy page.

Recent Improvements To Drone Range


Drones, and specifically quadcopters, have come a long way since they were first used in the military as unmanned aerial vehicles.

New-age drones are as popular in filmmaking and photography as in mission-critical and rescue operations. Photography enthusiasts, explorers, and documentary makers are thrilled with the technology that lets them fly without themselves being airborne, all with a drone controller in their hands.

While there are many features to research when contemplating the purchase of a drone, one of the most popular is flying range: the maximum distance from the controller at which the drone can fly without experiencing any lag.

Over the years, drone range has improved so much that some of today’s drones can fly up to 18 km (11 miles) from the controller! DJI once again leads the charge in this category.


DJI Mavic 2 – Released in 2018 with a control range of 18 km

As you read this, you may think that drones with such an extensive control range would be limited to commercial use, but in fact, these drones are readily available in the retail market. If you are in the market for one, here is an article describing the best long-range drones available to the public.

Are you atop a rocky mountain, and wish your camera lens could capture the flora on the other side of the deep abyss in front of you? Your advanced quadcopter has you covered.

Interestingly, the flying range of a drone is not a separate feature but is composed of the technology behind several components of the machine. Let us look at some of the key parts that make up a long-range drone.

The Size

When it comes to size, long-range drones operate best when designed with optimum weight and dimensions. While a lighter and smaller drone might be easier to fly, it also needs sufficient battery power to go the distance. A supremely lightweight drone would thus be likely to fail if it is unable to carry the required weight of the battery.

The Battery

Now that we know a long-range drone needs a good battery, what do we define as good?

Well, your drone needs enough power to complete its long flight and return safely to where the controller is. Typically, a 5,000 mAh battery is a good starting point for your drone to fly away and back without hiccups.
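As a back-of-envelope illustration, flight time is roughly the battery's energy divided by the average power draw. The pack voltage, power draw and reserve margin below are illustrative assumptions, not the specs of any particular drone:

```python
# Rough flight-time estimate for a long-range quadcopter.
# All figures are illustrative assumptions, not specs of a real product.

def flight_time_minutes(capacity_mah, voltage_v, avg_draw_w, usable_fraction=0.8):
    """Estimate flight time from battery capacity and average power draw.

    usable_fraction keeps a reserve so the drone can return and land
    before the battery is fully depleted.
    """
    energy_wh = (capacity_mah / 1000.0) * voltage_v  # mAh -> Ah -> Wh
    return energy_wh * usable_fraction / avg_draw_w * 60.0  # hours -> minutes

# Assumed: a 5,000 mAh 4-cell pack (14.8 V nominal) and ~90 W average draw.
print(round(flight_time_minutes(5000, 14.8, 90), 1))  # ≈ 39.5 minutes
```

Note that doubling the capacity only doubles the estimate on paper; in practice the extra battery weight raises the average draw, which is why size and battery have to be balanced, as described above.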

The Communication Equipment

It is easy to lose sight of your drone when it has a longer flight range, and hence, it is vital to maintain strong, stable radio communication between the quadcopter and the controller at all times.

Support for dual frequency bands is often recommended to keep your drone under control on those long journeys and prevent it from crashing.

Automatic Navigation

While stable communication between the drone and the controller is a basic requirement, it is always safer if the drone is equipped with the intelligence to return or land safely in case of a power or communication failure. This is the reason new-age long-range drones have built-in global positioning system (GPS) capabilities.

Apart from these, there is another feature that you might want to look into before investing in a long-range drone, which is the camera. If the main purpose of owning a long-range drone is to search and discover places or things among difficult terrain, it must be equipped with an excellent camera to capture those discoveries.

Drones are technically equipped to bring back essential elements to humans from difficult to reach areas. Thanks to revolutionary technologies, long-range drones are now a reality.


Ruaan is an electrical engineer from Oslo, Norway. He is the owner and lead author of buybestquadcopter.com, and when not rambling on about drones, he can be seen flying his Phantom 4 around town.

Sky Power receives ISO 9001:2015 Certification


Highest quality, excellent customer care and an efficient, sustainable, lean manufacturing process in accordance with the Lean Production approach are the focus of the UAS engine manufacturer Sky Power GmbH from Germany. This has now been confirmed with the successful ISO certification.

“The certification is important for us as an international company. As a German engine manufacturer, we carry the unofficial quality label ‘Made in Germany’, which is now officially confirmed by the certification,” explains managing director Karl Schudt. ISO 9001 defines the requirements for quality management systems. “Our customers expect the highest quality, customer orientation and flexibility from us. We have fulfilled this expectation since the company was founded and have thus gained a good reputation within the UAS industry in a short time. Therefore, it is even more important for us that an external certification organization attests that we meet these self-imposed requirements and are on the right track,” adds Schudt.


Sky Power GmbH is a leading manufacturer of 2-stroke combustion engines and Wankel engines for Unmanned Aerial Systems (UAS) as well as hybrid applications. All of its development and production take place in Germany. Customizations, new developments and expanding the performance of its internal combustion engines are further corporate goals.

A Crash Intro to AI-powered Object Detection


Almost every day, we hear we can use the superpowers of AI to make our work easier, automate routine tasks, speed up our workflows, increase accuracy and make more money. In the drone industry —especially in the geospatial sector— we hear a lot about how AI will help us extract actionable information from unstructured image data at a scale and speed never previously seen.

We are quite familiar with the potential of AI, but what if you are not a machine learning expert?
The good news is you don’t need to be a machine learning expert —and you don’t need to hire one— to harness the power of AI. Picterra has created an online platform —with an easy-to-use graphical user interface— to make AI-powered object detection accessible to everyone.

The signature tool of the platform is the custom detector, which allows you to train your own AI detection model in just a few steps, without writing a single line of code.

All you need to know to get the custom detector working

AI models can be good students, but they are not human. They lack your intuition and they see things differently. You need to teach them to see the world through your eyes.

To train an AI model to detect objects in an image you need to tell the algorithm WHERE it should learn relevant information and show it examples of WHAT it should —and shouldn’t— learn to find.

The first step is to understand how you “see” objects. Think about it. How do you define what the object you are looking for looks like? How do you identify a single unit of this type of object? What are the visual features for which you are looking? Is it the shape? The color? The size? The texture? A concrete part of the object? The combination of all of them under certain circumstances?
Once you have identified the key visual features that define your object of interest, you can teach the AI model to find it.

For demonstration purposes, we will walk through a challenging sheep detection project, using the custom detector tool on the Picterra platform.


This is the image before adding any training information:

Image: the view before adding training areas

On the left is what you can see; on the right is what the AI model can see before you tell it where to look. Exactly, it sees NOTHING!

You need to tell the AI model to open its eyes and provide it with information it can “see” to learn from.

Where should you define training areas?

Analyze your image and find spots where you have examples of your object of interest and spots where you don’t have them. These spots are called “training areas”. The algorithm will look at them in order to learn.

Select some of them to tell the algorithm WHERE it should look for examples of both what you are interested in and what you are not interested in. Keep in mind that the AI model won’t learn from the other sections of your image that you didn’t highlight.

Where you have examples: highlight them

These are sections of your image that you highlight to tell the algorithm, “look at this region, here are the examples of what I need you to find”.

Each training area should contain multiple examples of your object of interest. It is important to draw a series of training areas that highlight your objects of interest in different contexts.

You want to identify sections of your image where your objects of interest appear on different backgrounds, in different distribution configurations, or in different lighting conditions:

Image: sheep on different backgrounds

Here the human intelligence in charge is telling the AI model to have a look at these sections of the image. At this stage, only the human knows what is in the selected spots —sheep on a background in full shadow, sheep on the grass, and sheep on the bare ground.

Where you don’t have examples: define counterexample areas

Defining areas where you know there are no examples of your object of interest helps the algorithm understand what the things you are NOT looking for look like.
The AI model will use these sections of your image as counterexamples. It is particularly helpful to draw the algorithm’s attention to areas containing objects that look similar to your object of interest but are not what you are looking for. It usually also helps to include spots that are pure background.

Image: empty (counterexample) training areas

Here we are telling the AI model to have a look at these sections of the image. It doesn’t know it yet, but we will use these spots to teach it that bushes, grass, and dogs are not sheep.

The algorithm will learn what sheep look like by having a look at both the training areas containing examples and the empty training areas containing counterexamples.

Once we have defined the training areas, the AI model knows where to look for information.

Image: the view with training areas added

On the left is what you can see; on the right is what the AI model can see once you have added training areas.


You have already told the algorithm WHERE the regions are on which it should keep its “eyes” focused. Now it is time to tell it WHAT it should look for.

You should start by identifying the visual features that define the object you are interested in.
In order to do so, you need to think about what helps you recognize an object as such.
The next step is outlining —in other words annotating— these objects. This is the way you communicate to the algorithm WHAT you need it to learn to find.

How should you draw your annotations?

Learning how to draw your annotations is an intuitive and experimental process.
How do you define a “unit” of this type of object? What is the key visual factor you “see”?
Is it the full object? Or is it a specific and distinctive part of it?
In this case, we started with full-body outlines:

Image: annotated training areas

We drew polygons to outline the sheep.

Make sure you annotate all the relevant objects contained in your training area and the ones crossing its boundary. Keep in mind that anything contained in a training area that is not highlighted as an example will be considered a counterexample.

That’s everything the model needs to start doing its homework. Now you can click “Build & Test Detector” to train the model and get it to detect objects on the rest of your image.

Now, let’s check what the algorithm learned:

Image: detection output

You can see the AI model learned what sheep look like and detected all the sheep – and only the sheep.

However, a closer look shows that in areas where the sheep are very close to each other, they were not detected as individual objects:

Image: output with merged detections

All sheep were detected, but due to the proximity of their bodies, some of the detections are merged.

Do you want to go further than detecting the sheep? Do you want to count them?

In this image, the sheep are standing very close to each other, making it a very challenging project to count them individually. We know that the way you annotate an object has an influence on the output, so we decided to explore a few variations in the method of drawing the annotations to check how it affected the outputs.

Let’s compare how annotating the objects differently produces different outputs. For reference, in this image, the known sheep count is 433.

Outlining full body of the sheep:

[Image: full-body annotations, no inset]

With full-body outlines, the detection output has a number of merged detections, giving an object count of only 71 sheep:

[Image: detection output with merged detections]

The AI model detected 16.4% of the sheep as individual objects.

Using a different drawing method, this time insetting the contour of the full body:

[Image: inset full-body annotations]

The detection output has fewer merged detections and an object count of 396 sheep:

[Image: detection output with inset annotations]

This method allowed the model to detect 91.5% of the sheep as individual objects.
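The inset trick can be pictured as a negative buffer on each outline: shrinking the polygon inward leaves a gap between neighbouring annotations. A small sketch with Shapely, using a made-up rectangular outline rather than a real sheep polygon:

```python
from shapely.geometry import Polygon

# A made-up full-body outline of one sheep, in pixel coordinates.
full_body = Polygon([(0, 0), (10, 0), (10, 6), (0, 6)])

# "Insetting" the contour is a negative buffer: the outline is shrunk
# inward by 1 pixel, so neighbouring annotations no longer touch.
inset = full_body.buffer(-1.0)

print(full_body.area, round(inset.area, 1))  # 60.0 32.0
```

Applying the same inward shrink to every annotation keeps adjacent examples separated, which is why the model produces far fewer merged detections.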

Using circles to annotate the heads of the sheep:

[Image: circle annotations on the sheep heads]

The output has even fewer merged detections, although a few sheep were still not detected, for an overall headcount of 416:

[Image: detection output with circle annotations]

The model detected 96% of the sheep as individual objects.
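The per-method detection rates quoted above follow directly from the counts against the known total of 433. A few lines of Python reproduce them:

```python
known_count = 433  # ground-truth number of sheep in the image

# Individual detections reported for each annotation style.
results = {
    "full-body outline": 71,
    "inset full-body outline": 396,
    "head circles": 416,
}

for style, detected in results.items():
    print(f"{style}: {100 * detected / known_count:.1f}%")
# full-body outline: 16.4%
# inset full-body outline: 91.5%
# head circles: 96.1%
```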

You can have a closer look and explore the annotations and the outputs generated in this project on the Picterra platform.

How to get it wrong?

Keep in mind training and customizing an AI detection model is an intuitive and iterative process: you will need to explore and test what works best for each type of object you want to detect.

However, there are certain mistakes you can avoid:

  • Not defining training areas. Even if you have annotated objects, without training areas the model sees nothing.
  • Defining large training areas that contain only a few small annotations. As a rule of thumb, 20 to 40% of the space inside a training area that contains examples should be covered by annotations. If you want to add a counterexample area, add it as a separate area.
  • Defining very small training areas. A training area should contain your objects of interest plus a fair amount of background, which helps the model understand the context in which your objects appear. Try to balance the annotated and non-annotated elements.
  • Wrapping a training area tightly around a single object.
  • Defining too many areas containing examples and very few containing counterexamples, or vice versa. Again, it is all about balance and including a variety of both examples and counterexamples.
  • Using annotations that contain very few pixels; these are likely to give bad results. Here, consider our recommendations linking object size and image resolution: trying to detect our woolly friends in a lower-resolution image would not have worked, because the selected pixels would not carry enough information for the model to distinguish them from the rest of the image.
  • Not annotating all of the examples contained in your training area.
  • Not annotating examples that are only partly contained in a training area. You want them to be considered as examples; if you leave them unannotated, they will be treated as counterexamples.
  • Overlapping annotations when the end goal is counting individual objects.
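The 20 to 40% coverage rule of thumb is easy to check programmatically before training. A sketch with Shapely, where both the training area and the annotations are hypothetical rectangles in pixel coordinates:

```python
from shapely.geometry import box
from shapely.ops import unary_union

# Hypothetical training area and annotations, in pixel coordinates.
training_area = box(0, 0, 100, 100)
annotations = [
    box(10, 10, 30, 30),  # each box covers 400 px^2
    box(50, 50, 70, 70),
    box(10, 60, 30, 80),
]

# Union first so overlapping annotations are not double-counted.
ratio = unary_union(annotations).area / training_area.area
print(f"annotation coverage: {ratio:.0%}")  # annotation coverage: 12%
# Below the ~20-40% rule of thumb: add annotations or shrink the area.
```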

Build your own AI detector

Discover what type of annotations work best for the type of object you need to track and for the context they are in.
You might be trying to detect a type of object that has a totally different shape, pattern, and color. These objects might appear distributed throughout your image or might be grouped in a different pattern.

[Image: examples of different object types and distributions]

There are many possible variables, but the good news is the custom detector tool allows you to experiment, tweak and fine-tune the model to your needs. As you build and refine your detector you will gain experience and intuition and learn how to best take advantage of the power of AI.

You can find step by step instructions here.

Drone Solutions Launches the Centre of Drone Excellence


Singapore, March 19, 2019: Drone Solution Services Pte Ltd officially announced today the commencement of its Centre of Drone Excellence activities, which will be highlighted at its presentation booth at the Unmanned Systems Asia 2019 exposition, held April 9-11, 2019 at the Changi Exhibition Centre in Singapore.

Centre of Drone Excellence

The Centre, staffed with engineers from a myriad of drone-related disciplines including mechanical, electrical, mechatronics, AI, acoustics/sonar and robotics engineering, represents the local interests of international companies with advanced aerial, undersea and terrestrial drone technologies and associated products, providing them with marketing, sales, technical presence and engineering support in greater Asia.

For those entering the Asian marketplace the Centre “takes the ouch out of new market entry”, offering a plethora of services:

– Centre staff give product and technical presentations, hold (new) product demonstrations, deliver client pitches, conduct product training, and provide customer (technical) support;

– Staff arrange governmental formalities and legalities for Members, including acting as an interlocutor for their goods and services;

– The Centre provides full administrative infrastructure and support staffing, including office space and meeting/conference/demo and board rooms, fulfilling customer service needs at the service levels Members require;

– It coordinates Member trade fair, expo and congress participation, infrastructure and staffing as and where necessary;

– It provides multilingual staff with Mandarin, Cantonese, Japanese, Bahasa Melayu, Bahasa Indonesia, Korean, Thai, Burmese, French, Italian, German, Dutch and English skills, enabling the furtherance of Members’ commercial aspirations.


Drone Solutions

Drone Solution Services Pte Ltd is a privately-held Singapore company engaged in researching, designing, developing, manufacturing, marketing, selling, licensing and patenting proprietary intellectual property rights related to Unmanned Systems Technology (“UST”) products.

The Company was formed to facilitate the improved scientific development of USTs, with a vision to make drone technology universally available and accessible to society at large as well as in the commercial arena, and to promote the widespread use, acceptance and integration of drone technology for humanitarian aid and the overall betterment of the general public.

Drone Solutions is in a constant pursuit of excellence through technological innovation and novelty enabling it to provide bespoke and advanced state-of-the-art unmanned aerial (“UAS”), unmanned underwater (“UUV”) and unmanned ground (“UGV”) solutions across a diverse range of industrial segments.
