Sponsored by Blighter Surveillance Systems
19 Jan 22. Get ready for ‘computer-assisted’ shooting with the US Army’s new optic. Improved range calculation, infrared vision, target tracking and more await troops in the new optic recently chosen for the Next Generation Squad Weapon. How soldiers see their downrange target, from basic grunt to advanced sniper, is about to go high-tech.
The Army recently announced its selection of Vortex Optics/Sheltered Wings as the producer of its Next Generation Squad Weapon Fire Control on a $2.7bn, 10-year contract for 250,000 devices.
Traditional optics essentially just magnify the sight picture for the shooter and provide a reticle to align a sight to the target for better accuracy than what iron sights can provide.
But the new NGSW Fire Control mimics in some ways what’s been available to gunners on ground vehicle platforms such as the Abrams tank, Stryker and Bradley Fighting Vehicle for decades — computer-assisted shooting.
That’s one reason the NGSW Fire Control will eventually replace the close combat optic, rifle combat optic and machine gun optic within the close combat fighting force. That force is primarily special operations, infantry and immediate infantry support such as scouts and combat engineers.
This year, the Army is also expected to select a Next Generation Squad Weapon prototype to replace the M4, M16 and M249 Squad Automatic Weapon.
The NGSW will fire a never-before-seen cartridge — a government-developed 6.8 mm round. That round falls into what engineers call the “intermediate caliber” range. It’s noticeably larger than the 5.56 mm in squad weaponry but only slightly smaller than the heavier 7.62 mm used for medium machine guns such as the M240 at the platoon level.
But, according to Army experts, the 6.8 mm outperforms both existing rounds for range, accuracy and lethality while coming in at a lighter weight than the 7.62 mm.
While the new weapon will be built around that round, the new fire control is platform agnostic.
Matt Walker, a retired Army command sergeant major and career Ranger now with the Army’s Cross Functional Team-Soldier Lethality, told Army Times that the ballistics computer inside the NGSW Fire Control allows it to be used with just about any existing individual or crew-served weapon. And thanks to software upgrades and data changes, new caliber combinations could be added in the future.
“We put just about every ballistic calculation you could put into it,” Walker said.
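The sort of computation a fire-control ballistic calculator performs can be illustrated with a deliberately simplified flat-fire drop model. This is an illustrative sketch only, not the NGSW Fire Control’s actual solver, and the drag constant used here is invented for the example:

```python
import math

def bullet_drop_mil(range_m: float, muzzle_velocity_mps: float,
                    drag_const: float = 0.0005) -> float:
    """Rough holdover in milliradians for a flat-fire shot.

    Toy model: velocity decays exponentially with distance,
    v(x) = v0 * exp(-k * x), so time of flight is
    t = (exp(k * x) - 1) / (k * v0), and gravity acts over t.
    A real fire-control solution also accounts for air density,
    temperature, cant, inclination, spin drift and more.
    """
    g = 9.81  # gravity, m/s^2
    t = (math.exp(drag_const * range_m) - 1) / (drag_const * muzzle_velocity_mps)
    drop_m = 0.5 * g * t ** 2           # gravity drop over flight time
    return (drop_m / range_m) * 1000.0  # angular holdover in mils

# Example: ~900 m/s muzzle velocity, 600 m target
holdover = bullet_drop_mil(600, 900)
```

Longer shots need more holdover, which is why the calculator is paired with the laser rangefinder and atmospheric sensors listed above.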
The Army expects future shooting engagements to take place over great distances. Studies showed that 5.56 mm rounds were less than effective at standard fighting distances in the mountains of Afghanistan, for instance.
Developers determined what the round needed to do on target and then worked backwards from there to the platform that would need to launch that projectile, Walker said.
“And obviously, if you don’t hit the target you get none of the effects that we just described,” Walker said.
So, other than the shooter, the NGSW Fire Control is in some ways the most important part of getting that round on target.
The NGSW Fire Control will contain the following features in one device:
- Variable magnification optic
- Backup etched reticle
- Laser rangefinder
- Ballistic calculator
- Atmospheric sensor suite
- Intra-soldier wireless
- Visible and infrared aiming lasers
- Digital display overlay
The wireless feature should allow it to connect with another soldier device under development, the Integrated Visual Augmentation System, a kind of “do-it-all” goggle that is going through testing and evaluation this year.
The IVAS features augmented reality capabilities based on the Microsoft HoloLens mixed reality headset. The ruggedized version for military use will eventually include Rapid Target Acquisition software that allows shooters to toggle through displays, seeing a camera view from the goggle, from the weapon sight, or a picture-in-picture view showing both.
The goggle also allows users to see through dust, smoke and other obscurants, use a thermal view, and use navigation headings, wayfinding points and other situational awareness tools, such as wireless drone feed footage, in real time.
But even without the goggle, the NGSW Fire Control can use the rapid target acquisition feature to better lock onto targets. An in-line thermal sight can be added in front of the device to give thermal options as well.
The user won’t even have to remove the fire control to mount the thermal, as is required with some current optics.
The Army picked a fire control (optic) for its Next Generation Squad Weapon in early January 2022. A downselect of the remaining prototypes of the actual weapon is expected in early 2022. (Jacki Belker/Staff)
The host of capabilities inside of the device might seem overwhelming to a new infantry soldier or Marine. But the device can be tailored to the needs of each individual user, Walker said.
“We’re not going to do away with training, the most important thing we can do with our soldiers is train them to be effective,” Walker added.
That means soldiers are still going to be spending plenty of trigger time on iron sights, basic optics and shooting the M4 carbine before they shoulder the NGSW and use its space-age fire control.
While the fire control does many things not previously seen in one package, one hurdle that remains is windage — how the wind affects the round’s trajectory downrange.
Troops can simply eyeball flags or windsocks on known distance ranges and adjust fire to hit the target, but the means to do that using technology is still too large to fit atop an individual weapon, Walker said.
Maj. Wyatt Ottmar, also with CFT-SL, told Army Times that a total of 1,000 soldiers from the 101st Airborne Division, 10th Mountain Division and 82nd Airborne Division ran through 25,000 hours of evaluating the entire NGSW program, including the fire control, as it progressed through development.
The Army is also in the midst of a Platoon Arms Ammunition Configuration study that is likely to determine what will replace the M240 medium machine gun, potentially a counter-defilade weapon and other platoon-level firepower decisions. More to follow on that one.
The Army is currently formulating the fielding plan for the NGSW Fire Control, which will include what unit will be the first equipped. But Walker did share that while the fire control will eventually reach the schoolhouses for infantry, scouts and engineers, it is first going to the operational force. (Source: Army Times)
18 Jan 22. High Performance, Reliable LWIR Thermal Imaging with Teledyne FLIR Tau 2 Series. The Tau 2 series offers 40+ variants of the most reliable and rugged longwave infrared (LWIR) thermal camera module. Equipped with a simple optical interface and common commercial interfaces, integration with the Tau 2 and new Tau 2+ is seamless. Teledyne FLIR has you covered with high performance and reliable longwave infrared (LWIR) Tau 2 series thermal camera modules. Common interfaces and access to a US-based Technical Services team reduce development risk, development cost, and shorten your time to market. If you are developing with Tau 2, we can help.
With an unmatched combination of features and reliability, the Tau 2 and the increased sensitivity Tau 2+ are well suited for integration in demanding applications including UAVs, unmanned applications, handheld imagers, security cameras, maritime cameras, and thermal sights. Integration is seamless with common optical, mechanical, and electrical interfaces and the most lens options available on the market. See what Tau 2 solution is best for your application.
18 Jan 22. British E-3D Sentry ‘AWACS’ aircraft to be sold to Chile. A source close to the now-retired E-3D Sentry fleet has told the UK Defence Journal that Chile is looking to purchase “more than one” retired E-3D aircraft from Britain. One of the aircraft has already been sold to the United States, to be used as a dedicated trainer supporting its E-6B Mercury airborne communications and command post fleet. The UK originally operated seven of the aircraft type. In December 2020, only three remained in service after one was withdrawn from service in 2009 to be used as spares, two were withdrawn in March 2019 and a further one withdrawn in January 2020.
It is unknown how many aircraft Chile intends to purchase, but I have been told that it will be “more than one”.
I have contacted the Ministry of Defence for comment on this news and I will update this article when I receive a response.
The ‘E-3D’ variant features CFM56 engines and some British modifications and was designated Sentry AEW.1 in RAF service. Modifications included the addition of a refuelling probe next to the existing boom AAR receptacle, wingtip ESM pods, an enhanced Maritime Surveillance Capability offering ‘Maritime Scan-Scan Processing’ plus JTIDS and Havequick 2 radios.
The RAF’s E-3 Sentry airborne early warning aircraft fleet was retired in September, with its replacement, the E-7 Wedgetail, not due until 2023. The UK will rely on the NATO Airborne Early Warning and Control Force to plug the gap.
The first two of three E-7 Wedgetail airborne early warning aircraft for the Royal Air Force are starting to take shape. Air Marshal Andrew Turner of the Royal Air Force tweeted the following: STS Aviation is converting three Boeing 737 airliners into E-7 Wedgetail airborne early warning aircraft at its facility at Birmingham Airport.
Wedgetail is an airborne early warning and control system, commonly known as AWACS or AEW&C. Such aircraft are designed to track multiple targets at sea or in the air over a considerable area for long periods of time. This aircraft is replacing the E-3D Sentry, pictured below.
The plan had previously been for five aircraft, but the recent ‘Defence Command Paper’ reduced the order from five to three. The paper, titled ‘Defence in a Competitive Age’, stated:
“We will retire the E-3D Sentry in 2021, as part of the transition to the more modern and more capable fleet of three E-7A Wedgetail in 2023. The E-7A will transform our UK Airborne Early Warning and Control capability and the UK’s contribution to NATO. The nine P-8A Poseidon maritime patrol aircraft will help to secure our seas.”
The first of the E-7 Wedgetails purchased by the UK to replace the E-3 Sentry Airborne Warning And Control aircraft will arrive in 2023. (Source: News Now/UKDJ)
18 Jan 22. Leading global eyewear manufacturer Bollé Safety launches new brand division Bollé Safety Standard Issue for its range of tactical eyewear. Bollé Safety, the world-renowned manufacturer of safety glasses and goggles, today announced the rebranding of its Bollé Tactical division to become Bollé Safety Standard Issue. The new Bollé Safety Standard Issue range of products has been designed to offer unparalleled eye protection for law enforcement, first responders, firefighters, EMS, military, and special forces, as well as shooting enthusiasts and hunters. The company’s mission is to give tactical first responders the competitive advantage they need to do their job in the safest way possible. Bollé Safety Standard Issue creates protective eyewear that uses the most advanced technology to ensure its customers not only see the best, but also look the best.
Each Bollé Safety Standard Issue product is a tailor-made solution answering the specific needs of each service provider. Their entire protective eyeglasses range is compliant with all safety standards globally.
“Our mission is to provide the best eye protection to those who need it and the people who protect us, beyond just the military field, so renaming the Bollé Tactical Division was evident,” explained Rubina Meunier, Vice President of Brand from Bollé Safety. “Bollé Safety Standard Issue is a more comprehensive name, with the ‘Standard Issue’ phrasing resonating globally and the new logo being catchy, bold, and a bit mysterious.”
Bollé Safety began over 130 years ago as a small family workshop in France, and pioneered tactical eyewear in the late 1950s. Since then, Bollé Safety has become a leading global manufacturer of protective eyewear worn by first responders, military, healthcare professionals, industrial workers, shooting enthusiasts and hunters worldwide. To find out more about the new brand division please visit https://www.bolle-safety.com/tactical.html (Source: PR Newswire)
14 Jan 22. How Silicon Valley Is Helping the Pentagon Automate Finding Targets. When the Pentagon asked Google to quietly build a tool that could identify objects in drone footage, what it found was a nascent worker revolt against explicitly building weapons.
The program, Project Maven, was launched in April 2017 with the goal of processing video taken by drones. The plan was to identify and label objects in those videos using computer algorithms, with those labels helpful for troops who would be picking targets.
It presented the possibility that the military could finally process all the thousands of hours of video collected by drones, much of which typically sat unused and unwatched, and then rapidly make life or death decisions relying on trusted computer algorithms.
But in March 2018, Google workers raised internal objections to the company’s participation in the project, before coming out with a public letter arguing that Google should not be “[b]uilding this technology to assist the US Government in military surveillance — and potentially lethal outcomes.”
In response to the letter, and the resignations of multiple employees, Google announced it would not renew the contract and published a set of guiding principles for how it would use and develop artificial intelligence, or AI. While Google maintains military contracts to this day, Project Maven hangs over the Pentagon and all of Silicon Valley as a cautionary tale about applying commercial code to military ends.
But Maven was hardly the first time the Pentagon contracted tech companies to build an object recognition tool. A year before Maven got up and running, the Air Force Research Lab signed a contract on a little-noticed program called VIGILANT, details of which were disclosed in October 2021 as part of a Freedom of Information Act request.
“VIsual Global InteLligence and ANalytics Toolkit”, or VIGILANT, was first commissioned in 2016.
While the Air Force has not publicly disclosed details of its deal with Kitware, a New York-based technology company, for VIGILANT, documents about the regularly changing contract offer some insight into how the military hopes to adapt the kind of data processing that thrives in Silicon Valley to use in wars abroad.
The contracts reveal a desire to increase the fundamental tempo of intelligence collection and, with it, targeting. The promise is time savings from algorithms taking a first sweep for potential targets. The possible follow-on effects are impenetrable targeting tools, with errors as classified as airstrikes and harder to attribute.
Crucial to developing object identification algorithms is training them on synthetic data, or fake data concocted from fragments of real information. Creating that data has long been part of training automated identification algorithms. It’s how tech companies regularly prepare machines to operate in the real world, where they might encounter rare events.
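As an illustration of the basic idea, not of Kitware’s actual pipeline, synthetic training imagery is often built by compositing a rendered object patch into real background imagery. A minimal sketch, ignoring the lighting matching, blur, sensor noise and perspective that real pipelines must model:

```python
import numpy as np

def composite(background: np.ndarray, obj: np.ndarray,
              row: int, col: int, alpha: np.ndarray) -> np.ndarray:
    """Paste a rendered object patch into a background image.

    alpha is a per-pixel mask (0..1) for the object patch.
    Returns a new image; the background is left untouched.
    """
    out = background.astype(float).copy()
    h, w = obj.shape[:2]
    region = out[row:row + h, col:col + w]
    out[row:row + h, col:col + w] = alpha * obj + (1 - alpha) * region
    return out

# Toy example: a 3x3 bright "object" dropped into a dark 10x10 scene
bg = np.zeros((10, 10))
obj = np.full((3, 3), 200.0)
mask = np.ones((3, 3))
frame = composite(bg, obj, 4, 4, mask)
```

Each composited frame comes with a free, perfectly accurate label (the paste location), which is exactly what makes synthetic data attractive for training.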
To create this synthetic data for drone footage, the Air Force Research Lab turned to Kitware, a company with an existing framework for processing data from multiple sources. The software is open source, meaning that all its code is publicly disclosed so that programmers can develop and riff off of its initial kernel of code.
When reached for comment, a spokesperson for Kitware said the company did not have permission to speak publicly about the program.
The documents outlining the contract with Kitware described how the Air Force thought the technology could be used for everything from combat to farming.
“This demonstration system will be delivered for analyst assessment and transition into operations at Air Force, the intelligence community and commercial companies,” reads the contract award. “With great potential for both military and commercial analytics, applications range from mapping crop damage in precision agriculture to commercial vehicle counting to adversary order of battle monitoring.”
VIGILANT, at least in 2016, was pitched as both sword-adjacent and a ready-made plowshare.
The Pentagon Gets Interested in Automated Target Recognition
The military had begun looking in earnest at object recognition after academic researchers demonstrated in a very narrow 2015 test that computers could do better than humans at labeling objects.
“As far as the Department of Defense was concerned, that was an important day,” former Deputy Secretary of Defense Robert Work said in an interview with The Atlantic.
The 2016 contract award for VIGILANT focused only on satellite footage, and came with the premise that it would release “both the framework and the analytics as open source,” facilitating its use by organizations outside of government.
A year after the initial award of VIGILANT, the DoD started funding Project Maven, with the stated objective of developing algorithms to label data in drone-collected videos, and then figure out how the military could incorporate those algorithms in planning.
Maven, like VIGILANT, was about using AI to shift the balance of time. In May 2017, John Shanahan was an Air Force lieutenant general overseeing a range of emerging tech acquisitions. He told Defense One that the goal of Maven was to clean up video, “finding the juicy parts where there’s activity and then labeling the data.” This would replace the work done at the time by three-person teams of analysts. With the AI doing a first pass over the video, the human analysts would in theory be able to devote more of their time to confirming highlighted findings, instead of discovering and analyzing changes themselves.
“Project Maven focuses on computer vision — an aspect of machine learning and deep learning — that autonomously extracts objects of interest from moving or still imagery,” Marine Corps Col. Drew Cukor, head of the Algorithmic Warfare Cross-Function team, said in a DoD press release in July 2017.
While VIGILANT trained on satellite imagery, and Maven on drone images, any hard separations between the programs would blur with the launch of VIGILANT 2.
VIGILANT 2, awarded by the Air Force to Kitware in February 2018, expanded the focus to electro-optical sensors, primarily but not exclusively on satellites. Working with data across a range of sources was a goal of the software outlined in the first contract. In VIGILANT 2, it becomes explicit, with the contract noting, “While commercial satellite data will be the focus of this effort the technology developed can be applied to other electro-optical platforms both in the air and space domains.”
That meant building a model with, and for, data from satellites and data from aircraft, including footage recorded by military drones. To be most useful, the algorithm processing that data would have to be able to work across a range of sensors. In practice, the algorithm was to deliver “change detection capabilities,” by contrasting recent images with past collected imagery. That’s a data processing challenge and a data-identifying challenge.
When done by open-source analysts, change detection often starts with identifying a point of interest, and then looking backward in time at earlier images while fixated on that point to recognize any small changes. The process, useful for investigations, is a kind of after-the-fact assessment. For the military, which wants to direct the movement of people and vehicles in real time, automated analysis could identify changes of military significance faster – at least, that’s the idea.
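A bare-bones version of that automated change detection, assuming the two images are already co-registered (the hard part real systems must solve across passes, sensors and angles), can be sketched as pixel-wise differencing:

```python
import numpy as np

def detect_changes(before: np.ndarray, after: np.ndarray,
                   threshold: float = 30.0) -> np.ndarray:
    """Return a boolean mask of pixels that changed significantly.

    Assumes both images are grayscale arrays of the same shape and
    already aligned; the threshold is arbitrary for this sketch.
    """
    diff = np.abs(after.astype(float) - before.astype(float))
    return diff > threshold

# Toy example: a bright 2x2 "vehicle" appears in the second image
before = np.zeros((8, 8))
after = before.copy()
after[2:4, 2:4] = 255.0
mask = detect_changes(before, after)
```

The mask flags only where the new object appeared, which is the "change of military significance" an analyst would then be asked to confirm.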
If the algorithm could correctly identify a vehicle, and if it could track it across satellite and drone footage quickly enough, and comprehensively enough, then what VIGILANT 2 offered would be the means to see and follow that vehicle’s movements across a country, possibly leading the military to locate and then target insurgent networks. That premise, tantalizing as it is, comes with deep caveats at every stage, from the specificity of tracking to even correctly identifying a vehicle in the first place.
Models of Failure
Accurately identifying a target, especially at distance from an actual firefight, can be a difficult task. Doing so in accordance with the laws of war, which set standards for when members of a military are legally allowed to pull triggers, is an elaborate process, one for which the Joint Chiefs of Staff published a 230-page manual in 2016. The manual, which was made available to the public in November of that year, emphasizes the importance of correctly identifying a target, noting, “In extreme cases, failure to exercise due diligence in target development can result in outcomes that have negative strategic repercussions for the United States and its allies.”
Because so much in military targeting hinges on the identification being correct, it makes the stakes for any targeting tools high. This is especially difficult in a field, like computer vision, where errors are an almost inevitable part of development.
“Machine learning is only ever going to be as good as the quality of its labels. It is only ever going to be that good,” said Liz O’Sullivan, who in January 2019 left a job at the tech company Clarifai over objections to that company’s work on Project Maven. “If you have too many blue and orange people in your data set, and you’re trying to identify pink people, it’s not going to work as well. And it’s going to have a higher error rate. What degree of error rate is acceptable when you’re taking a human life?”
O’Sullivan left Clarifai after realizing that the company’s technology would end up being used for military operations.
“Part of my journey was basically realizing that if you want to advance the science of object detection, you’re just inadvertently going to be contributing to this global arms race. And so it didn’t make any sense for me to ultimately continue down that path, ” she said.
The Air Force Research Laboratory, which commissioned VIGILANT 2, did not respond to a request to comment for this story.
One of the special challenges for VIGILANT 2 is that it was training not only with publicly available data, but also classified data. It would also incorporate synthetic data into its analysis, so that the object identification algorithm could learn to find certain types of objects for the military without those items having been captured by satellites or drone footage. In these cases, Kitware would build 3D models of objects and then incorporate them into the process of object identification.
Incorporating synthetic data for rare or hard-to-observe events is a fairly common process in training identification algorithms. The stakes of doing it for the military, and generating synthetic data that anticipates as-yet undetected possible targets, risks targeting decisions being made on imagined fears, coded through AI into legibility.
“They’re trying to use computer vision to read minds, and it’s not a crystal ball,” said O’Sullivan. “It can’t see through walls, and the assertion that it can infer intent based on patterns is just outrageously naive.”
Consider the steps one would take to identify a suspected chemical weapons facility. Are there external tanks or barrels? If those are known and modeled, can that model be reasonably incorporated into existing footage of facilities? And what happens if a building with such a setup is selected as a valid target? If the intelligence, modeling and targeting were all correct, then it’s possible a strike on such a facility would meet its military objectives.
If any part of it was wrong, then a wholly innocuous facility that just happens to look like a valid target ends up destroyed, and lives are likely lost, too.
Some strikes, and the possibility of helpful data, are deliberately left off video. In a Dec. 12, 2021, story, The New York Times detailed the actions of Talon Anvil, a military unit charged with finding targets for the U.S. war against ISIS in Syria and Iraq. Talon Anvil operated from 2014 through 2019, a timeline that overlaps with the first known operational uses of Project Maven for computer-assisted object identification and targeting.
As reported, in a move designed to avoid accountability for casualties inflicted by its strikes, “Talon Anvil started directing drone cameras away from targets shortly before a strike hit, preventing the collection of video evidence.”
In an update to the VIGILANT 2 contract, from September 2020, a work order specifically requested that more of this data labeling work be done by unsupervised machine learning. Unsupervised learning is a process that puts a tremendous amount of trust in AI to find similarities in images, and then group those found objects together in useful categories, rather than having a human dictate those categories.
This was a design call that leans toward speed of identification over accuracy, making more labels available quickly at the expense of training the algorithm to more accurately find known quantities.
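The unsupervised grouping the work order describes can be illustrated with a toy k-means clustering of feature vectors: similar items fall into the same cluster without any human-provided labels. This sketch stands in for whatever method the contractor actually uses:

```python
import numpy as np

def kmeans(features: np.ndarray, k: int, iters: int = 20) -> np.ndarray:
    """Group feature vectors into k clusters with no labels at all."""
    # Greedy farthest-point initialization (deterministic for this toy)
    centers = [features[0]]
    for _ in range(k - 1):
        dists = np.min(
            [np.linalg.norm(features - c, axis=1) for c in centers], axis=0)
        centers.append(features[dists.argmax()])
    centers = np.stack(centers)
    for _ in range(iters):
        # Assign every vector to its nearest center...
        d = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # ...then move each center to the mean of its members
        for j in range(k):
            if (labels == j).any():
                centers[j] = features[labels == j].mean(axis=0)
    return labels

# Two well-separated blobs of toy "image embeddings"
rng = np.random.default_rng(1)
blob_a = rng.normal(0.0, 0.1, size=(10, 2))
blob_b = rng.normal(5.0, 0.1, size=(10, 2))
labels = kmeans(np.vstack([blob_a, blob_b]), k=2)
```

The speed comes from skipping human labeling; the risk is that the machine’s categories, not an analyst’s, define what counts as a target.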
“The contractor will explore techniques and algorithms that will provide the USAF with a high degree of [Automatic Target Recognition] flexibility by producing new Deep Learning based models that can be trained as fast as possible with the lofty goal of hours,” reads the contract.
This is in contrast to existing targeting systems, which can take months or years to build and are sensor-specific. By asking Kitware to build Automatic Target Recognition that can load in hours onto a new camera, the Air Force is suggesting that the process itself is sufficiently trustworthy to be put into combat rapidly.
Another addition outlined in the VIGILANT 2 contract is an emphasis on incorporating sensors from satellites as well as sensors from aircraft into the same analysis. The Air Force also requested that the identification software be designed to fit on smaller devices, able to run as much of its processing as possible on board a satellite or aircraft.
While this is still specified as a “user-in-the-loop” technique, meaning a person would still be involved in the analysis, the ability to process intelligence on a machine without sending it back to the computer of a human operator means the human would, at best, be approving targeting assessments made by an algorithm, instead of having the option to review both the data and the assessment.
In the world of public and peer-reviewed research on computer vision, examples of algorithmic error abound. In one of the more famous examples, researchers had a trained algorithm that correctly identified a Granny Smith apple with 85% confidence. But when the researchers instead put a paper on the apple that read “iPod,” the algorithm said it was an iPod with 99.7% confidence.
Placing trust in algorithms for targeting decisions, even if it is just the initial winnowing down of collected evidence, means leaving military actions open to unique errors from machine intelligence.
The military is investigating some of these limitations, and the results are not promising. This month, Air Force Maj. Gen. Daniel Simpson described a target-recognition AI, trained on a specific angle of a missile. Fed a different angle of that same missile, the algorithm correctly identified it only 25% of the time. But more troubling for Simpson, Defense One reports, is that “It was confident that it was right 90 percent of the time, so it was confidently wrong. And that’s not the algorithm’s fault. It’s because we fed it the wrong training data.”
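The failure Simpson describes is a calibration gap: the model’s average confidence far exceeds its accuracy. Using the article’s own figures (right 25% of the time, about 90% confident), a toy check of that gap looks like this:

```python
def calibration_gap(confidences: list, correct: list) -> float:
    """Mean confidence minus accuracy.

    Near zero for a well-calibrated model; large and positive for a
    model that is, in Simpson's words, "confidently wrong."
    """
    accuracy = sum(correct) / len(correct)
    mean_confidence = sum(confidences) / len(confidences)
    return mean_confidence - accuracy

# The scenario from the article: 25% accuracy at ~90% confidence
confs = [0.9] * 20
hits = [True] * 5 + [False] * 15
gap = calibration_gap(confs, hits)  # 0.9 - 0.25 = 0.65
```

A gap that large means the reported confidence scores carry almost no information about when the model should be trusted.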
VIGILANT by Kitware is just one of a host of software tools built to bring data processing from the online worlds of Silicon Valley to the life-or-death stakes of military operations. As of September 2020, VIGILANT 2’s contract award was for up to almost $8m, or about 1/10th the cost of a single F-35A Joint Strike Fighter.
Yet despite its huge promise at discount rates, the technology is fundamentally trying to solve a human problem. AI can look for patterns in video footage, and it can make approximate guesses about what objects it has found on the ground below. But it is doing so at human direction, from where the cameras are put to what kinds of objects it is told to look for. What AI mostly adds to that is a kind of distancing between the fallible decisions of intelligence collection and the false certainty of algorithmic assessment.
“We’ve always been worried that the military is going to take their horrible track record of drone violence and use that as a training set to generate a model,” O’Sullivan said. “And in doing so lock in all of the mistakes that we’ve made in the past as predictive of being what kinds of mistakes will happen in the future. And I think there’s a really great risk of that happening with this case.” (Source: Military.com)
15 Jan 22. DARPA wants to build slimmer, enhanced night vision goggles. The Pentagon’s blue-sky R&D agency chose 10 partners to help make goggles easier on soldiers’ necks and expand vision in the dark.
The Pentagon’s blue-sky research agency recently announced that it selected 10 organizations to develop enhanced, less bulky night vision goggles, both to keep the crimp out of servicemembers’ necks and to help them see better at night.
“Current night vision (NV) systems are bulky and heavy, resulting in a significant torque on the wearer’s neck. This torque greatly limits the wearer’s agility and often leads to chronic injury over prolonged use,” said a Defense Advanced Research Projects Agency announcement Wednesday. “Additionally, existing NV devices only provide a narrow field of view (FOV) and are limited to the near-infrared (IR) spectral bands, greatly limiting situational awareness in varied night conditions.”
The program, dubbed Enhanced Night Vision in Eyeglass Form (ENVision), is divided into two technical areas to solve those problems. The five organizations selected for the first technical area will try to reduce the size and volume of night vision goggles. For that job, DARPA chose Raytheon Technologies Research Center, SRI International, University of California-San Diego, University of Washington and Physical Sciences, Inc.
The second technical category will use new technologies and processes to convert and amplify infrared to visible light. The agency selected Raytheon BBN, Stanford University, University of Central Florida, University of Melbourne and University of Pennsylvania.
“These efforts could potentially lead to all-optical night vision systems in the future without the need for image intensifiers,” said Rohith Chandrasekar, ENVision program manager in DARPA’s Defense Sciences Office. (Source: Breaking Defense.com)
13 Jan 22. US Navy Trains to Counter Drone Threats at Point Mugu. The Pacific Target Marine Operations (PTMO), a division of Naval Air Warfare Center Weapons Division’s Threat/Target Systems Department, recently deployed small drones over Naval Base Ventura County (NBVC), Point Mugu to provide cost-effective unmanned aerial system (UAS) familiarization and threat training.
“The Low Speed Aerial Target-Small (LSAT-S) program developed a cost-effective target training and deployment program that directly represents the UAS threat the fleet faces daily,” said Pete Pena, PTMO Program Lead. “UAS are classified by their size, range, and speed, and are broken into five groups based on those attributes. We’re flying group 1 drones, which are considered to be the greatest threat to military forces across the globe due to their unique range of capabilities as well as their relatively low cost and small size.”
In 2021, speaking at a U.S. Senate Committee, Gen. Kenneth McKenzie, commander, U.S. Central Command referred to the proliferation of small drones as the “most concerning tactical development” since the emergence of improvised explosive devices.
Groups 1-3 can range from over-the-counter handheld drones to medium-sized drones with sensors and the capacity to deliver weaponized payloads. However, the main threat from groups 1-3 is intelligence, surveillance, and reconnaissance (ISR). These drones can be difficult to detect and destroy due to their low flying altitude and small size.
“Point Mugu is a no drone zone,” said Fire Controlman 1st Class Michael Jordan, assigned to NBVC. “It is difficult to obtain authorization to operate drones in this controlled airspace, even for military units; so, this demonstration provided a rare opportunity for watch standers to experience live drone flights and provide identification, which is the first step in countering threats.”
In 2019, Ellen Lord, the former undersecretary of defense for acquisition and sustainment, established a waiver system to authorize drone operations on military ranges in highly controlled conditions, to test the U.S. military’s counter-UAS capabilities.
Civilian and military operators had a chance to fly multiple scenarios at Point Mugu, Pena added. Each test presented a range of conditions, spanning from the direction a UAS was flying to a variance in flight patterns, altitudes, airspeeds, and representative threats.
“This demo is a force multiplier which allows us to offer more frequent and robust counter-UAS presentations to the fleet and installation commanders,” said Cmdr. Todd “Jazz Hands” Faurot, LSAT-S pilot. “This increases our defenses during peacetime while also providing a wartime surge capability.”
The first step in countering the rising threat from UAS is target acquisition and identification. The proliferation of UAS (especially group 1-3), the downsizing of the technology, and its decreasing costs of production will make threat detection difficult.
“Our demonstrations provide the fleet with important UAS familiarization and training to face this increasing airborne threat,” added Pena.
NBVC is comprised of three distinct operational facilities: Point Mugu, Port Hueneme and San Nicolas Island. It is Ventura County’s largest employer and protects Southern California’s largest coastal wetlands through its award-winning environmental program. (Source: ASD Network)
Blighter Surveillance Systems is a world-leading designer and manufacturer of best-in-class electronic-scanning ground-based radars, surveillance solutions and Counter-UAS systems. Blighter’s solid-state micro-Doppler products are deployed in more than 35 countries across the globe, delivering consistent all-weather security protection and wide area surveillance along borders, coastlines, at military bases and across critical infrastructure such as airports, oil and gas facilities and palaces. Blighter radars are also used to protect manoeuvre force missions when deployed on military land vehicles and trailers, and its world-beating multi-mode radar represents a great leap in threat detection technology and affordability for use in a variety of scenarios.
The Blighter range of radar products is used for detecting a variety of threats, from individuals on foot to land vehicles, boats, drones and low-flying aircraft at ranges of up to 32 km. Blighter Surveillance Systems employs 40 people and is located near Cambridge, UK, where it designs, produces and markets its range of unique patented solid-state radars. Blighter prides itself on being an engineer-led business committed to providing cost-effective and flexible solutions across the defence, critical infrastructure and national security markets.