
NEW TECHNOLOGIES, AVIONICS AND SOFTWARE

Sponsored By Oxley Developments

www.oxleygroup.com

————————————————————————

15 Oct 20. Department of Defense Announces $197.2m for Microelectronics. Today, the Department of Defense announced it has awarded over $197m to advance microelectronics technology and strengthen the American microelectronics industrial base. This critical industry will underpin the development of other Department of Defense technology priorities such as artificial intelligence, 5G communications, quantum computing, and autonomous vehicles.

“The microelectronics industry is at the root of our nation’s economic strength, national security, and technological standing. Today’s awards support the Department’s mission to promote microelectronics supply chain security and accelerate U.S. development of the very best in circuit design, manufacturing, and packaging. It’s critical for the DOD and American industry to work together in meaningful partnerships to ensure the United States leads the world in microelectronics far into the future,” said Michael Kratsios, Acting Under Secretary of Defense for Research and Engineering.

The nearly $200m will be issued through two DOD programs: the Rapid Assured Microelectronics Prototypes (RAMP) using Advanced Commercial Capabilities Project Phase 1 Other Transaction Award, and the State-of-the-Art Heterogeneous Integration Prototype (SHIP) Program Phase 2 Other Transaction Award.

The RAMP Phase 1 Other Transaction awards totaling $24.5m will be awarded to Microsoft and IBM to advance commercial leading-edge microelectronics physical “back-end” design methods with measurable security.

The SHIP Phase 2 Other Transaction awards totaling $172.7m will be awarded to Intel Federal and Qorvo to develop and demonstrate a novel approach towards measurably secure, heterogeneous integration and test of advanced packaging solutions.

These awards highlight how the Department is moving towards a new quantifiable assurance strategy that will help the DOD quickly and safely build and deploy leading-edge microelectronics technologies. This is a departure from the previous model of security that severely limited our ability to work with leading-edge firms, and demonstrates the Department’s forward-looking approach to promoting security.

The RAMP and SHIP awards also underscore the Department’s commitment to accelerating the nation’s rapid pace of microelectronics innovation through the development of novel manufacturing capabilities and new architectures for chips. (Source: US DoD)

14 Oct 20. US Army Seeks Open Architecture For All Air & Ground Systems: Jette. Starting with a summit this December, the Army’s acquisition chief wants to harmonize technical standards across ground vehicles, aircraft, and other systems so they can easily use common components in their design – potentially saving millions – and share tactical data in battle – potentially saving lives.

Imagine a tank that can detect it’s being shot at and automatically alert other tanks in its formation, its defensive sensors feeding the threat location directly to the others’ weapons so they can fire back, the Army’s senior acquisition executive says. Or, Bruce Jette continues, imagine a laser weapon that automatically checks with friendly drones and aircraft to make sure they’re out of the path of the beam before it starts zapping an incoming missile.

But that kind of machine-to-machine exchange of data in near-real-time can only happen if all the vehicles and aircraft involved use compatible electronics, Jette told me. That’s not the case today – but Jette’s taking on the task of making it happen on future upgrades and new systems, at least across the Army. The Optionally Manned Fighting Vehicle to replace the M2 Bradley, he says, will be an early test case for the new approach.

There’s already been substantial progress that Jette seeks to build on, he said in an interview during the 2020 AUSA conference. For years, the military and industry have labored to define common technical standards for specific areas, like FACE for avionics and VICTORY for electronics on ground vehicles. This December, Jette told me, the Army will start trying to harmonize all those different standards together to form one meta-standard, a common Modular Open Systems Architecture for everything the service builds.

UPDATE: Jette’s staff provided a list of the major existing architectures they’re seeking to harmonize:

  • Army Common Operating Environment (COE)
  • C5ISR/EW Modular Open Suite of Standards (CMOSS)
  • Future Airborne Capability Environment (FACE)
  • Integrated Sensor Architecture (ISA)
  • Sensor Open Systems Architecture (SOSA)
  • Vehicular Integration for C4ISR/EW Interoperability (VICTORY)

If Jette succeeds in unifying these different standards – and it’ll be far from easy – the resulting architecture could govern every contractor, official, and engineer trying to develop new weapons or upgrade existing ones. Getting a wider variety of weapons systems to use compatible electronics would also simplify the automated machine-to-machine exchange of data that is fundamental to a future Joint All Domain Command & Control network.

Jette’s goal is to define both “subsystems” – common components, such as radios, computers, even humble connector cables – that can go into a wide range of air and ground vehicles – and “super-systems” – ways for different vehicles to share data. Instead of focusing on each individual vehicle, aircraft, or other system as a self-contained box, each with its own uniquely customized components, the new emphasis is on what internal components and overarching connections those platforms have in common.

And instead of locking down a design, mass-producing it, and only then thinking about potential upgrades, Jette said, each ground or air vehicle will be able to constantly evolve. This, he said, will allow “continuous development” in which it’s easy to swap in new and improved components, as long as they meet the standards set by the open architecture. It’s like swapping one Lego piece for another: They may be different shapes and colors, but they still click together — in an infinite variety of shapes.
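To make the Lego analogy concrete, here is a minimal sketch of how a published interface lets components from different suppliers be swapped without touching the platform that uses them. This is not any actual Army or vendor code; the Radio interface, vendor classes and message format are invented purely for illustration.

```python
# Illustrative sketch only: how a modular open systems architecture lets
# components be swapped as long as they meet a published interface.
# The interface and component names here are invented for the example.
from typing import Protocol


class Radio(Protocol):
    """The 'standard' every vendor's radio must satisfy."""
    def send(self, destination: str, payload: bytes) -> None: ...


class VendorARadio:
    def send(self, destination: str, payload: bytes) -> None:
        print(f"[Vendor A] -> {destination}: {len(payload)} bytes")


class VendorBRadio:
    def send(self, destination: str, payload: bytes) -> None:
        print(f"[Vendor B] -> {destination}: {len(payload)} bytes")


class Vehicle:
    """The platform depends only on the interface, never on a vendor."""
    def __init__(self, radio: Radio) -> None:
        self.radio = radio

    def report_threat(self, grid: str) -> None:
        self.radio.send("formation", f"threat at {grid}".encode())


# Swapping components is a one-line change, like clicking in a new Lego piece.
Vehicle(VendorARadio()).report_threat("38S MB 12345 67890")
Vehicle(VendorBRadio()).report_threat("38S MB 12345 67890")
```

The platform code never references a specific vendor, so replacing one radio with another, or adding a third supplier, requires no change to the vehicle software. That, in miniature, is the economic and technical point of a modular open systems architecture.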

Standards Summit In December, Industry Input In 2021

“We’re going to have the first meeting in December,” Jette told me. That summit will involve Army officials and experts working on different open-architecture standards, he said. That will set up ongoing meetings and a standing committee to start thrashing out how to make those existing standards more compatible.

Then in January or February, Jette continued, he’ll start bringing in representatives from the defense industry. “It’s not just going to be the Army saying what we want and then telling industry,” he emphasized. “Industry needs to be a participant in the process of defining what those standards are.”

Jette’s already had informal discussions with three CEOs of major defense companies, he told me, and they reacted positively.

But, I asked, why should they? After all, the time-honored business model for many contractors is to sell the government proprietary technology that none of their competitors can work on, which gives whoever sold the original product a de facto monopoly on maintenance, sustainment, and future upgrades. Open architecture is appealing to government officials because they are no longer locked into one vendor but can look to multiple competitors for any given component, as long as all competitors meet the common standard. That eases upgrades and lowers cost to the government. But what’s in it for industry?

Strong companies see open architecture as an opportunity to make more sales, Jette replied. Today, if you sell a component to the Army for one particular program, it probably won’t work in any other program without modification. Maybe the targeting data will need to be output in a different format, for example, or the cable will need a different number of pins to plug in. But if multiple programs use the same open architecture standards, you can sell the same thing to any and all of them, and it should just plug-and-play. If you think your technology is best-in-class, that dramatically increases the number of competitions you can enter and win.

“If it’s an open system,” Jette told me, “your marketplace just blossomed.” (Source: Breaking Defense.com)

14 Oct 20. This intelligence agency has found new ways to find magnetic north. The National Geospatial-Intelligence Agency has announced the winners of a $2.1m competition to find new and better ways to measure the Earth’s magnetic field.

Traditionally, gathering data about the Earth’s magnetic poles has been a challenge, partly because they do not align with the geographic poles placed on maps and partly because they are constantly moving. Knowing where the magnetic poles are in relation to those geographic poles is essential for accurate navigation, which is why NGA and the United Kingdom’s Defence Geographic Centre maintain the World Magnetic Model (WMM)―a periodically updated reference that allows users all over the world to access the most up-to-date information about the poles. The model is embedded in thousands of military systems, including submarines, satellites and aircraft, enabling them to accurately navigate all over the world using the Earth’s own magnetic field.

As the intelligence agency charged with mapping the world, NGA launched MagQuest in March 2019 to find new ways to collect data for the WMM.

“The importance of the World Magnetic Model cannot be overstated: It is critical to keeping everyday navigational systems running,” said NGA Senior GEOINT Authority J.N. Markiel in a statement. “MagQuest has advanced scientific and technical innovations that could be key to the future of geomagnetic data collection, specifically approaches that are both sustainable and scalable.”

The first prize winner is Iota Technology’s Satellite Instrumentation for Geomagnetic Analysis (SIGMA). Working with teams from Oxford Space Systems and AAC Clyde Space, Iota Technology’s solution uses a 3U CubeSat with a deployable boom and digital magnetometers to collect data.

In addition to the $350,000 awarded to the first-place winner, NGA awarded $255,000 prizes to two second-place winners and $50,000 to two runners-up. The cash prize competition drew interest from both academia and industry, with participants putting together space, aerial, oceanic, and land-based approaches to the problem.

NGA said in a statement that the MagQuest results will inform how the agency collects geomagnetic data for WMM in the future, with an expected procurement in 2027.

“We are thrilled by how open innovation has allowed us to bring pioneering ideas to life and spur novel technological advancements,” said Markiel in a statement. “The competition has not only driven new approaches to geomagnetic data collection, but also shows how open innovation can be used to support the future of critical scientific infrastructure.” (Source: C4ISR & Networks)

15 Oct 20. Fixposition Launches Sensor for Drones and Robots. Fixposition, a spin-off from ETH Zurich (ETH), has developed a new sensor allowing robots and drones to navigate even the most challenging environments.

Fixposition has launched the Vision-RTK, a centimeter-accurate positioning solution that uses a unique sensor fusion algorithm to integrate GNSS, camera and inertial sensors. The Vision-RTK is designed to provide high-reliability positioning for drones and robotics in challenging environments where Global Navigation Satellite System (GNSS) signals may be limited or non-existent.

GNSS-based navigation methods suffer from limited reliability in areas where satellite signals may be blocked or degraded. Standalone computer vision technologies are sensitive to lighting and weather conditions (e.g., snow, strong sunlight, rain) and struggle in situations where there is a lack of features, such as cornfields and grass. Standalone inertial sensors are prone to errors that accumulate over time, causing large drifts.

Solutions based on only one of the above methods are limited in their range of operation and are likely to fail in certain conditions. Fixposition’s solution increases the potential of these sensors, using unique sensor fusion technology to increase both positioning accuracy and the ability to operate in different environments.

The Vision-RTK’s real-time sensor fusion provides centimeter-accurate absolute positioning at any time, in any outdoor environment. The compact and lightweight module, incorporating two RTK-GNSS receivers and visual inertial navigation, is ideal for SWaP (size, weight and power)-constrained applications that still require high accuracy. The dual-receiver configuration, together with Fixposition’s advanced algorithms, provides a true-heading output and increased resistance to electromagnetic radiation.
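Fixposition has not published its fusion algorithm, but the general principle of blending drift-prone inertial dead reckoning with absolute position fixes (from GNSS or vision) can be illustrated with a textbook one-dimensional Kalman filter. The motion model and noise figures below are illustrative assumptions only, not the Vision-RTK’s internals.

```python
# Minimal 1-D illustration of GNSS/inertial fusion with a Kalman filter.
# This is a generic textbook sketch, NOT Fixposition's proprietary algorithm;
# the noise values and motion model below are illustrative assumptions.
import numpy as np


def fuse(accel_meas, gnss_meas, dt=0.1, accel_var=0.5, gnss_var=4.0):
    """Fuse accelerometer-propagated motion with intermittent GNSS fixes.

    accel_meas : iterable of acceleration samples (m/s^2), one per time step
    gnss_meas  : iterable of GNSS positions (m), or None when no fix is available
    """
    x = np.zeros(2)                     # state: [position, velocity]
    P = np.eye(2) * 10.0                # state covariance
    F = np.array([[1, dt], [0, 1]])     # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])     # acceleration input model
    H = np.array([[1.0, 0.0]])          # GNSS observes position only
    Q = np.outer(B, B) * accel_var      # process noise from accel noise
    R = np.array([[gnss_var]])          # GNSS measurement noise
    positions = []
    for a, z in zip(accel_meas, gnss_meas):
        # Predict: dead-reckon with the inertial measurement
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        if z is not None:
            # Update: an absolute fix pulls accumulated drift back toward truth
            y = z - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        positions.append(x[0])
    return positions
```

A production system such as the Vision-RTK would estimate many more states (3-D position, velocity, attitude, sensor biases) and add visual odometry as a further measurement source, but the predict/update structure sketched here is the standard one: the inertial sensor carries the solution between fixes, and each absolute fix corrects the drift.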

The system features USB and Wi-Fi connectivity and an easy-to-use web interface, allowing easy integration into state-of-the-art autopilot control systems for UAVs (unmanned aerial vehicles), UGVs (unmanned ground vehicles) and other robots. It is plug-and-play compatible with platforms such as PX4, ROS and Apollo.

Vision-RTK increases device uptime and efficiency while enabling new applications in previously inaccessible market segments. For end-device manufacturers, this technology eliminates the need for in-house development of advanced localization solutions, shortening sales cycles and reducing costs and investments.

Dr. Zhenzhong Su, CEO and co-founder of Fixposition, commented:

“With Vision-RTK, Fixposition offers the only compact solution for positioning and navigation in challenging outdoor environments, enabling applications in previously unreachable areas and opening a whole new world of possibilities for autonomous ground robots and drones.”

Lukas Meier, CTO and co-founder of Fixposition, stated:

“The highly integrated nature of our Vision-RTK sensor and our deep expertise in computer vision and Real-Time Kinematic (RTK) GNSS enabled us to implement completely new approaches in sensor fusion resulting in previously unseen performance.” (Source: UAS VISION)

15 Oct 20. MASA Group Selected by the University of Defense in Brno. MASA Group SA (MASA), a market-leading developer of Artificial Intelligence (AI)-based Modeling & Simulation software for the defense, emergency preparedness, serious games, and game development markets, has been selected by the University of Defense in Brno, Czech Republic, to supply its flagship product, MASA SWORD.

The university principally intends to use SWORD for operational research purposes in the fields of war-gaming, analysis and experimentation.

Before selecting SWORD, the University of Defense evaluated offers from several companies. MASA’s flagship solution was selected for its sophisticated experimental module and its ability to simulate complex systems and behaviors, excelling at low, medium, and high operational levels. Thanks to SWORD’s advanced AI capabilities, complex operations can be simulated in accelerated real time with very few operators, or on occasion none at all. Several courses of action can be prepared and executed in parallel, and the simulation can be queried for evaluation data at any point during their execution. A full gamut of post-exercise analysis is also provided.

Having followed a comprehensive training program to achieve a high level of proficiency with SWORD, the university’s teams will customize the default content delivered with the product, bringing it into line with Czech military doctrine. This work will result in an entirely localized version of SWORD, containing Czech units, equipment, terrain, urban environments, and vital infrastructure.

Dean Colonel Ing. Jan Drozd Ph.D. says: “One of the things in experimentation that is an absolute must is to be able to define individual Unmanned Autonomous Vehicles, Robots both for land and air missions, and swarms of robots. It is widely accepted that the use of this type of equipment will become part of our job. SWORD also allows us to gain a better insight into urban warfare. We are looking forward to the training program, and we hope that SWORD will become our go-to tool for science and research.”

MASA GROUP’s CEO Marc de Fritsch states: “Working in close collaboration with this prestigious institute is a privilege for us and we feel that SWORD will serve Univerzity obrany extremely well. The University of Defense is the sixth organization to use SWORD for operations research. Current users are Parsons, DGA France, DSTA Singapore, DSTG Australia and two others that we are not permitted to name. This is proof of the capability of SWORD to simulate courses of action applied in research and operational environments.”

The Vice-Dean for Research, Maj. Assoc. Prof. Karel Šilinger, PhD, stated that “MASA SWORD has the potential to connect science and research activities carried out at the individual departments of the University of Defense and in the future the same could apply to other military schools and research institutions abroad”. In the hands of the Czech Republic Army, he expects that “the simulator could be used to gauge the value of equipment, devices and systems, prior to purchase”.

SWORD is an automated, aggregated, constructive simulation designed to improve training, analysis, and decision support for both commanding officers in the military and crisis managers in the emergency preparedness sector. SWORD offers users highly realistic scenarios as well as an open simulation platform which is interoperable with other simulations and Command & Control (C2) systems. It can be deployed via remote servers in the Cloud and is available in several languages (recent versions include English, French, Spanish, Brazilian Portuguese and Arabic). SWORD also includes Direct AI, a proprietary cutting-edge Artificial Intelligence engine developed by MASA.

14 Oct 20. Thales and NEHS DIGITAL join forces on French defence ministry project to use AI to fight COVID-19.

  • AI-based solution will analyse lung scan imagery to provide initial diagnosis for healthcare professionals.
  • Developed for a French defence ministry call for projects launched by the Defence Innovation Agency, with 668,000 euros in funding, the solution will be ready for deployment in the autumn of 2020.

The COVID-19 pandemic has put huge strain on healthcare services around the world, and this situation will likely continue for many months in some countries. To prevent hospital services becoming overstretched and better prepare for dealing with this kind of crisis in the future, the French Society of Radiology (Société Française de Radiologie – SFR) recommends CT (computed tomography) scans combined with mobile X-ray to diagnose COVID-19. These scans are performed by a radiologist, who has to interpret multiple images for each patient, often within a short timeframe, in order to assess lung damage.

Thales and NEHS DIGITAL, both major players in the French healthcare system, have joined forces to speed up the hospital admissions process in a crisis by using new technology. This project will leverage Thales’s expertise in radiology and artificial intelligence (AI) and NEHS DIGITAL’s extensive footprint in the French hospital system. NEHS DIGITAL is a subsidiary of nehs group.

The partners will develop a solution in record time using AI to analyse CT imagery of the chest and lungs. As soon as the images are taken, the AI service will make an initial recommendation, including a preliminary diagnosis and assessment of the level of criticality of any lung damage, so that medical staff can admit the patient to the appropriate ward and prioritise the most urgent cases. The project was selected after a call for projects launched by the French Ministry for the Armed Forces to combat COVID-19.

Databases of medical imagery relating to the COVID-19 epidemic are virtually non-existent today. NEHS DIGITAL, in partnership with the SFR, has now begun collecting anonymised chest scan data from around 100 hospitals in France as part of the FIDAC (French Imaging Database Against Coronavirus) project. To analyse these datasets, which must be as large as possible to ensure they are representative, Thales will set up an infrastructure to train the necessary algorithms using machine learning and develop an AI system that can make automated recommendations to healthcare professionals.
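Neither partner has published its model, but for readers unfamiliar with what the machine-learning step looks like, the sketch below shows the general shape of such a pipeline: a small convolutional network learning to flag CT slices. The directory layout, image size, class count and hyperparameters are hypothetical placeholders, not anything Thales or NEHS DIGITAL has disclosed.

```python
# Illustrative sketch only: a minimal CNN that classifies chest-CT slices as
# "COVID-suspect" vs "other". It is NOT the Thales/NEHS DIGITAL model; the
# folder layout, image size and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Grayscale(),                 # CT slices are single-channel
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])

# Expects ./ct_slices/<class_name>/*.png  (hypothetical directory layout)
train_set = datasets.ImageFolder("ct_slices", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 2),             # two illustrative classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

A deployable diagnostic aid would of course require far more than this: large, curated and anonymised datasets such as the FIDAC collection, validation against radiologist ground truth, calibration of the criticality assessment, and the clinical sign-off described below.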

Deployment of the AI service developed by Thales will make use of NEHS DIGITAL’s extensive cybersecure telemedicine infrastructure, hosted on a secure cloud. The partners aim to deploy an initial demonstrator of this solution over the next three months using NEHS DIGITAL’s existing platform, which is already used by almost all French hospitals. The solution will be continuously updated as the number of available images increases.

A medical committee will validate the solution and monitor rollout in hospitals.

This large-scale project is a first step towards more widespread use of AI to support the work of radiologists. Thanks to this collaborative approach by multiple stakeholders, and with the architecture already in place, the AI service could also help to diagnose other pathologies in the future.

13 Oct 20. F-35 jet’s problematic lightning protection system set to receive fix. By the end of 2020, F-35 fighter jets rolling off Lockheed Martin’s production line will be equipped with a modified lightning protection system that will fix problems discovered earlier this year, the company’s head of production said.

In June, the government’s F-35 Joint Program Office imposed flight restrictions on the F-35A conventional-takeoff-and-landing variant — the model used by the U.S. Air Force and most international customers — after the Air Force discovered an issue with the Onboard Inert Gas Generation System.

OBIGGS allows the jet to safely fly in conditions where lightning is present by pumping nitrogen-enriched air into the fuel tanks to inert them, preventing the aircraft from exploding if it is struck by lightning. However, maintainers at Hill Air Force Base’s Ogden Logistics Complex in Utah found damage to one of the tubes that distributes inert gas into the fuel tank, increasing the risk that the system may not function as designed.

While flight restrictions are still in effect, the Defense Department and Lockheed have come to an agreement on a fix for the OBIGGS system, Darren Sekiguchi, Lockheed’s vice president of F-35 production, told Defense News in an Oct. 5 interview.

The fix primarily involves “strengthening a number of brackets associated with these tubes for OBIGGS,” Sekiguchi said, which will ultimately allow the tubes inside the fuel tank to be held in place more securely and prevent movement that could lead to damage.

“Changes certainly have been incorporated in the production line already,” he said, adding that the first aircraft with the modified system will be ready for delivery by the end of 2020.

While a corrective action has been put in place for new F-35s, the Defense Department and Lockheed are still discussing how best to implement a fix for jets already fielded by the Air Force. The final schedule for retrofitting the jets will ultimately be determined by the service’s availability, but “doing the modifications in the field will likely take several years,” Sekiguchi said.

The ongoing negotiations will also determine whether Lockheed takes financial responsibility for the problem.

The F-35 Joint Program Office did not respond to a request for comment by press time.

Under the current flight restrictions, F-35As are prohibited from flying within 25 miles of lightning or thunderstorms — a practice that is typical for most flight training.

A June 5 memo by the government’s F-35 program office noted that damaged OBIGGS tubes had been documented in 14 of the 24 “A” models inspected at that time. Lockheed Martin paused F-35 deliveries for three weeks in early June to allow the company to validate whether it was properly installing OBIGGS systems, which are manufactured by BAE Systems. F-35 deliveries resumed later that month.

Solving the OBIGGS problem will put Lockheed one step closer to normalizing F-35 production after a challenging year. The company began ramping up production again in September after a three-month slowdown caused by the COVID-19 pandemic.

From May 23 to Sept. 4, Lockheed implemented an adjusted production schedule where the 2,500 employees at the Fort Worth, Texas-based line worked two weeks before having a week off. The slowdown was necessary, executives said, due to delays across the F-35′s global supply chain, which includes more than 1,900 vendors.

Lockheed now projects it will deliver 121 F-35s by the end of 2020 — 20 jets short of the 141 originally forecast this year.

During the slowdown this summer, F-35 production crawled to about eight to 10 jets a month, Sekiguchi said. While the goal is to gradually increase the rate to up to 14 F-35s delivered per month, he noted that some of Lockheed’s suppliers are still facing disruptions as the novel coronavirus sweeps across the globe.

“When you get to a point where you have a larger [COVID-19] breakout either in a critical skilled workforce, or that requires a quarantine of a larger workforce, that delay becomes more of a challenge,” he said. “We see it across [the board], from the detailed component suppliers all the way up through major assembly.”

During the beginning of the pandemic, the F-35′s international suppliers — particularly in Europe — saw the biggest production headwinds as countries locked down to try to control the transmission of the virus. American suppliers grappled with delays shortly afterward, as the pandemic began to spread through American cities and towns.

“Right now, although it has quieted somewhat back down, we do see pop-ups that are across the supply chain and affect us globally,” Sekiguchi said, noting that the company is watching for signs of a second coronavirus spike.

“We are seeing upticks, especially here following the Labor Day weekend, across the U.S. We’re keeping a close eye on those right now. Our plan really is, I would say, now that we’ve operated through the initial spike, should we need to implement any of those types of precautions, we can again.” (Source: Defense News)

13 Oct 20. W. L. Gore & Associates today announced a new solution that supports the higher voltages required as the trend continues towards aircraft electrification to reduce the impact of air transport on our environment. GORE® High Performance Aerospace Wires, GWN3000 Series, deliver the best combination of superior mechanical strength and outstanding electrical reliability without increasing wire bundle size or weight.

Ensure System Reliability over Time

GORE® High Performance Aerospace Wires, GWN3000 Series, provide a higher level of mechanical and electrical performance over time in current and next-gen aircraft. They ensure EWIS (electrical wire interconnection systems) reliability, increase operational readiness, improve safety and reduce total lifecycle costs.

The GWN3000 Series meets and even exceeds new rigorous industry standards that require higher levels of electrical and mechanical durability for wire bundles operating in extreme aerospace environments.

“We’re excited to offer a single, long-term solution to the industry that can solve complex EWIS challenges in today’s and next-gen aircraft,” said Jim Carothers, product manager.

New-generation engines and aerodynamic optimization have lowered fuel burn during the past half century of jet-age travel. However, recent efficiency and environmental gains are incremental compared to the paradigm shift inherent in an eventual transition to all-electric aircraft. That revolution and the near-term steps required to achieve it rely not just on new battery technology, but also the ability to deliver power sufficient to replace current systems based on combustion, hydraulics or pneumatics.

Gore’s unique, chemically inert, wire insulation does not degrade after exposure to harsh chemicals or humidity. The GWN3000 Series also reduces the risk of chafing, abrasion, and cut-through failures while easily tolerating wide temperature ranges. This new unrivaled wire insulation meets mechanical, electrical, and material stability needs in one solution for complex wiring systems in commercial and defense aircraft.

For more information about the GWN3000 Series of GORE® High Performance Aerospace Wires for commercial and defense aircraft, visit gore.com/highperformancewires or contact a Gore representative.

Performance Solutions

Gore develops products and technologies that address complex product and process challenges in a variety of markets and industries, including aerospace, automotive, pharmaceutical, mobile electronics and more. Through close collaboration with industry leaders across the globe, Gore enables customers to design their products and processes to be safer, cleaner, more productive, reliable, durable and efficient across a wide range of demanding environments.

12 Oct 20. General Atomics Aeronautical Systems, Inc. (GA-ASI) New SC2 Software Offers Massive Savings For Gray Eagle ER Teams.

Reduces Emplacement, Mission Launch Time and Ground Segment Footprint.

Massive reductions in emplacement, mission launch time and overall footprint size are in the works for Gray Eagle Extended Range (GE-ER) Unmanned Aircraft Systems (UAS), thanks to a new laptop-based interface called Scalable Command & Control (SC2) from General Atomics Aeronautical Systems, Inc. (GA-ASI).

GA-ASI recently concluded a series of flight tests using the new SC2 interface hosted on the U.S. Government’s Improved Portable Maintenance Aid (IPMA) that is fielded to all UAS units in the Army. This puts 100 percent of the functionality of the Ground Control Station (GCS) shelter on a laptop, greatly reducing the logistics burden of setting up, transporting and operating a Gray Eagle UAS. The effort was closely coordinated with GA-ASI’s Government customer to fully control a company-owned Gray Eagle Extended Range (GE-ER) UAS including pre-flight, taxi, takeoff and landing.

The operation was later conducted using a Government-owned GE-ER to confirm functionality with the field GCS software. The SC2 software enables control of the GE-ER and its payloads, while also allowing aircraft, payloads and sensors to be controlled by disparate users replicating a ground maneuver force or other disadvantaged user. SC2 also efficiently controlled the onboard sensors and commanded release of various payloads from disparate manufacturers all integrated in less than 90 days.

“SC2 incorporated significant automation and cognitive workload reduction for GE-ER operators, allowing them to focus on mission tasks,” said GA-ASI President David R. Alexander. “SC2’s pre-flight automation reduces emplacement and mission launch timelines by 75 percent from the currently-fielded Ground Control Station (GCS).”

SC2 is a collection of standalone software applications that reduce operator workload through automated checklists and optimize the operator’s steps for pre-flight, taxi, launch and recovery, health and status monitoring, sensor and payload control, and maintenance of the Gray Eagle UAS. GA-ASI believes SC2’s automation will allow enlisted operators to fully focus on the more difficult and operationally relevant mission tasks, leaving the more mundane tasks to the software with minimal man-in-the-loop involvement, meeting the Army concept of “supervised autonomy.”
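As a rough illustration of the “supervised autonomy” idea, an automated checklist runner executes routine checks itself and only surfaces exceptions to the operator. The sketch below is generic illustrative code, not GA-ASI’s SC2 software, and the checks are invented.

```python
# Illustrative sketch only: the general idea of an automated pre-flight
# checklist in which software runs routine checks and only escalates
# failures to the operator ("supervised autonomy"). The checks are
# invented placeholders, not GA-ASI's actual procedures.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Check:
    name: str
    run: Callable[[], bool]   # returns True if the check passes


def run_checklist(checks: list[Check]) -> list[str]:
    """Run every check automatically; return the items needing a human."""
    needs_operator = []
    for check in checks:
        if check.run():
            print(f"PASS  {check.name}")
        else:
            print(f"FAIL  {check.name} -> escalate to operator")
            needs_operator.append(check.name)
    return needs_operator


# Hypothetical checks for illustration
preflight = [
    Check("Datalink established", lambda: True),
    Check("GPS position valid", lambda: True),
    Check("Fuel above mission minimum", lambda: False),
]
print("Operator action required:", run_checklist(preflight))
```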

SC2 leverages previous automation tools and government approved architectures to reduce overhead and integration and sustainment costs. SC2 software incorporates GE-ER-specific capabilities needed for conducting everything from simple maintenance checks to full mission operations and flight testing. General Atomics is working closely with the Army to ensure SC2’s open architecture is aligned with the Army’s plans for Scalable Control Interface (SCI).

Incorporating SC2 into the fielded GE-ER configuration will allow the Army to significantly reduce the logistical footprint of a GE-ER platoon and provide a true expeditionary capability to the Army while enabling flexibility in the conduct of Multi-Domain Operations (MDO), providing situational awareness, critical and timely long range targeting information and enabling future vertical lift aircraft to focus on their missions. The vision is that any laptop computer hosting the SC2 software with a datalink interface (line of sight or beyond line of sight) will be able to interface with the GE-ER aircraft and control onboard systems.

“This capability will eliminate over 100,000 pounds of Army vehicles in each GE-ER platoon, providing maximum flexibility to unit commanders on the MDO battlefield,” said Alexander.

07 Oct 20. EASA invites comments on proposals for BVLOS operations in an urban environment. The European Aviation Safety Agency (EASA) invites comments on the overall concept presented at the webinar hosted by the agency on 1 October on beyond visual line of sight (BVLOS) operations in an urban environment.

According to the current EASA timetable, the Agency plans to:

  • Receive comments on this proposal by NAAs by 9 October
  • Review the changes proposed to EASA SORA with JARUS WG6 by end of October
  • Publish by the end of 2020 the Cert Memo/DOARI/GM or similar regulatory material to bridge time until adoption of new regulation
  • Publish by end of 2020/beginning 2021 a Decision with amendments to EASA SORA
  • Have a focused consultation with all affected stakeholders on the proposal of the new regulation on AW of certified UAS in medium risk in the specific category by Q1 2021
  • Publish the new regulation on AW of UAS in medium risk in the specific category by Q3 2021

A summary of the principal points raised and discussed during the webinar is available on the UVS International website, the association representing manufacturers and operators of remotely piloted systems. The main points are summarised as follows:

Population density

  • For the purpose of developing a map to identify the population density, EASA plans to launch a dedicated study;
  • Instead of static maps, EASA is looking to develop a technology to establish the real population density at any given moment. The study will define the most appropriate solution to achieve this objective. The study will also allow a clear definition of “populated areas” to be developed. The study will address the whole territory of Europe;
  • It has not yet been decided who will conduct the study; EASA is currently looking for resources to finance this activity.

EU (through EASA) is competent to assess the design of an aircraft (including drones)

  • EASA is the competent authority in the EU for the verification of the design of aircraft (including drones). EASA may in the future consider using the support of other entities (e.g. a Qualified Entity or NAA). However, it is too early at this stage to discuss this;
  • NAAs are the competent authorities for assessing operational and pilot competency aspects. Also in this case, NAA may be supported by Qualified Entities (QE) or recognized entities (e.g. consulting companies).

High robustness for SAIL V and VI

  • EASA believes that there is an inconsistency between the level of robustness required for OSO 4 (design) and OSO 5 (reliability). For SAIL V OSO 4 is “M” and OSO 5 is “H”. The reliability cannot be verified without the verification of the design. EASA intends to modify the robustness of OSO 4 as described in the presentation; however it will introduce the flexibility to use standards for experimental operations. EASA will discuss this with JARUS WG 6;
  • UAS to be operated in high risk operations in the specific category will be required to be certified according to Part 21.

JARUS SORA developments

  • When addressing the risk of collision where more than one UAS is flying in the same airspace (e.g. urban), SORA currently considers only a single UAS operation; JARUS WG 6 is now working on Annex G to SORA to tackle that drone-to-drone collision risk. However, EASA considers that in the first phase the number of UAS operations will not be too high;
  • JARUS is also working on cybersecurity issues. A new document on cybersecurity will shortly be out for consultation.

Light certification process for «medium assurance operations»

  • EASA offers the NAA the possibility to mandate (within the operational authorisations) that UAS operators use certified drones when conducting operations in the medium-risk category (SAIL III and IV). In that case, the liability (in case of technical failure) shifts from the UAS operator to the manufacturer. In this case, the EASA certificate will cover all OSOs related to design [OSOs 2, 4, 5, 6, 10, 12, 18, 19 (limited to criteria 3), 20, 24], and the NAA will verify compliance only for the remaining OSOs;
  • Manufacturers may also apply directly to EASA for a certification. Some manufacturers will find it beneficial to protect proprietary information;
  • In case a NAA does not want to require a certified UAS, the NAA will accept a declaration covering also the design related OSOs from the operator. The declaration will have to be signed by the UAS operator, who will declare the compliance to all OSOs and will bear the liability;
  • A regulation defining a simplified certification process (inspired by the light certification process defined in the new Part 21 Light, Part CAMO and Part ML with additional simplifications) will be developed.

CE marking: When is it mandatory?

To clarify this point, it is better to explain how the design of a drone will be assessed in the different categories.

  • In the certified category, and the specific category high risk, only drones with a (restricted) TC issued by EASA can be used;
  • In the specific category low/medium level of risk, there is some flexibility:
  • If the operation is covered by a standard scenario (STS), the CE class label marking is mandatory;
  • For all other operations, the following three options are possible:
    – The operator includes in the application for authorisation sent to the NAA the declaration of compliance of the drone with the design-related OSOs;
    – The NAA mandates that for some operations only drones with a TC issued by EASA can be used;
    – The manufacturer voluntarily applies for a (R)TC and puts a drone with the (R)TC on the market. In that case, the UAS operator will not need to declare compliance with the design-related OSOs.
  • With regards to PDRAs (predefined risk assessments), these still require an authorisation by the NAA. The PDRA already includes the package that a UAS operator needs to deliver to the NAA in order to get the authorisation; the purpose of PDRA is to facilitate the process for the UAS operator, so the authorisation process will again follow one of the 3 options above;
  • EASA is in the process of publishing, by the end of this year, two new PDRAs mirroring the STSs (the only difference from the STSs will be that a drone with a CE class mark will not be required) and an additional PDRA for BVLOS operations in sparsely populated areas in reserved airspace.

Experimental flights under the Specific category

  • The authorisation process will be the same as for ‘normal’ operations: the UAS operator will apply to the competent authority and show compliance with the applicable requirements, depending on the SAIL and the evidence provided by the operator;
  • It is foreseen that the vast majority of these operations will be conducted in a controlled area where the ground risk and the air risk are very limited. A provision will be included to give the flexibility not to use any design standard;
  • UAS operators are invited to develop PDRAs for the purpose of test flights.

JARUS SORA developments

  • JARUS WG 6 is already working to expand the scope of SORA to address the risk of collision when multiple UAS are flying in the same airspace (e.g. urban): for now, SORA considers only a single UAS operation; however, EASA considers that in the first phase the number of UAS operations will not be too high;
  • JARUS is also working on cybersecurity issues – the new Annex on cybersecurity will be out for consultation in a few weeks;
  • Annex F was published in April for WG6 internal consultation. JARUS is also addressing ground-to-ground risk and swarms of drones. Some documents should be issued in a few weeks.

Discussing changes to SORA with JARUS

  • EASA will discuss with JARUS the proposed changes to SORA;
  • Even if the changes are discussed and agreed with JARUS, JARUS may not update and publish SORA by the end of this year. In the EU the matter is more urgent and the timeline is different, as SORA has been included in the EU regulatory framework.

Way forward to ensure a uniform & consistent implementation of medium risk operations in the specific category

  • EASA is conducting weekly meetings with NAAs to understand their needs. We also plan to establish a common repository to share the best practices and then to decide which ones are the most effective. Based on this, EASA may then later on be able to accommodate some changes in the regulation (if needed);
  • EASA will monitor and assure that the approach is consistent. With the new approach, manufacturers will have a proportionate tool to make their products available on the market. With regards to reliability, it will be beneficial for both NAAs and manufacturers to follow the approach.

EASA invites comments on the overall concept presented and the points mentioned above. Comments are to be sent to Barbara Zygala (barbara.zygula@easa.europa.eu) and should be received prior to 14th of October 2020.

Link to the EASA presentation: https://rpas-regulations.com/wp-content/uploads/2020/10/EASA_Presentation_Operations-Medium-Risk-Specific-Category_201001_TR.pdf

For more information visit:

https://rpas-regulations.com/wp-content/uploads/2020/10/201004_EASA_BVLOS-Ops-in-Urban-Environments_TR.pdf

(Source: www.unmannedairspace.info)

10 Oct 20. Industry Perspective: Developing Military Electronic Systems Calls For Holistic Strategy. One can view the Defense Department’s Digital Modernization Strategy as a direct response to this 2018 National Defense Strategy goal: “Prototyping and experimentation should be used prior to defining requirements and commercial off-the-shelf systems. Platform electronics and software must be designed for routine replacement instead of static configurations that last more than a decade.”

Within the strategy, one can argue that the Defense Department chief information officer’s priorities — cybersecurity, artificial intelligence, cloud, command, control and communications — were developed to help achieve the aforementioned National Defense Strategy goal.

And that goes for the digital modernization goals as well: innovate for competitive advantage; optimize for efficiencies and improved capabilities; evolve cybersecurity for an agile and resilient defense posture; and cultivate talent for a ready digital workforce.

In response, the services are executing initiatives to meet these priorities and achieve these goals.

All these approaches are based on software development. Specifically, they are based on “DevSecOps,” the name now given to the software development approach. DevOps is a set of practices that combines software development (Dev) and information-technology operations (Ops) with the aim of shortening the systems development lifecycle and providing continuous delivery with high software quality. The “Sec” in DevSecOps acknowledges that, for the Defense Department, security issues are a paramount concern and must be addressed.

The use of DevOps has been a commercial best practice for years and it does address the department’s desire for agility. In truth, its adoption of DevOps is an excellent first step.

But there is a reason why this is a popular joke about DevOps:

Question: “How do DevOps engineers change a lightbulb?”

Answer: “They don’t. It’s a hardware problem.”

This joke highlights the wisdom of one of the popular quotes attributed to Alan Kay, the inventor of Smalltalk and the Alto, and the driving force behind Xerox PARC in 1982: “People who are really serious about software should make their own hardware.”

Fifteen years ago, the commercial electronics ecosystem was organized as it had been since the 1970s. There were “chip guys,” “software geeks” and “systems geniuses.” Each group worked hard to optimize their craft and great gains were made, and sins were masked thanks to the all-powerful Moore’s Law.

The limits of optimization began appearing somewhere between 2000 and 2005, with systems, software and single-core performance gains leveling off as predictions of power consumption causing rocket engine-level heat made waves in trade magazines. At that time, the first articles predicting the end of Moore’s Law and the crucial co-dependency between software and hardware were published, with Herb Sutter’s 2005 article in Dr. Dobb’s Journal, “The Free Lunch Is Over,” perhaps being the most prominent.

Sutter’s article stated that microprocessor serial-processing speed is reaching its physical limit, leading to two main consequences.

First, processor manufacturers would have to focus on products that better support multi-threading such as multi-core processors. Second, software developers would be forced to develop massively multi-threaded programs as a way to better use such processors. The “free lunch” — the constant improvement of hardware performance that made a software developer’s life easy — would come to an end.
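A toy example makes Sutter’s point tangible: the same CPU-bound workload run serially and then spread across cores with Python’s standard library. The workload and timings here are purely illustrative.

```python
# Toy illustration of the "free lunch is over" point: the same CPU-bound
# work run serially and then spread across cores. The workload is arbitrary.
import time
from concurrent.futures import ProcessPoolExecutor


def busy_work(n):
    """A deliberately CPU-bound task: sum of squares up to n."""
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    jobs = [5_000_000] * 8

    start = time.perf_counter()
    serial = [busy_work(n) for n in jobs]          # one core, one thread
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:            # roughly one process per core
        parallel = list(pool.map(busy_work, jobs))
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel
```

On a modern multi-core machine the parallel run finishes several times faster; what ended was the era in which the serial version simply got faster on its own with each new processor generation.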

Experts then predicted a new golden age of domain-specific architectures — custom hardware — and domain-specific languages: software optimized for the custom hardware.

The answer to ensure further gains was to optimize across the strata to support multi-core architectures. With this, the problem of power became the key driver.

In particular, the burgeoning smartphone market saw battery life as a key limitation to adding new capabilities and a major source of customer dissatisfaction. Additionally, consumers were showing an appetite for features such as web browsing that demanded more and more processing power.

This all required specialized hardware support: networking, video, multi-tasking, graphics, low power, audio, security and camera.

All of these elements had been available in separate products, but never brought together in a phone. Each of them had been highly optimized. However, to make a phone with all of these features, very different success metrics needed to be applied, and optimizing against this new, cumulative objective function required a new methodology. To drive the cost function down, it became clear that the traditional serial strata of the system development process needed to be shattered and that the design chain had to be restructured.

For example, until the early 2000s, the design chain of embedded mobile systems was dominated by platform-based designs. The semiconductor vendors provided all drivers, and then interacted with software OS vendors, like Palm OS, Symbian, Microsoft Windows CE and Pocket PC 2002, to port their operating systems to their silicon. They would then provide it jointly to device and equipment manufacturers. This situation has now been replaced by only two operating systems — iOS and Android — which integrate all the required services as middleware, in exchange enabling a much bigger ecosystem of apps that can be developed based on early representations of the hardware.

For example, think of the iOS and Android software development kits that are provided as pure software representations.

By taking on more responsibility for the hardware/software stack, a much bigger ecosystem of application developers has been unleashed. In exchange, however, the hardware abstraction layer, or HAL, of an Android device, for instance, must be architected and verified in a way that software development can start in parallel. This is what the industry today refers to as the “shift left.”

Demolishing the barriers between teams begins with eliminating the serial hardware-then-software process. Instead, software development has, at least partially, “shifted left” to overlap with hardware design. This change often makes the software team a little uneasy at first, as working on a “squishy” hardware platform is new ground.

But, as the software developers begin to recognize that they do not have to work around all of the hardware bugs anymore, they now have a choice as to whether to fix the problem at the hardware source or work around it in software. In fact, once this hardware/software design process is implemented, many software developers relish the ability to push the problems back to the hardware team.

Furthermore, to really co-optimize hardware and software, the industry is entering a phase of custom, configurable hardware with associated software. One clear example of this trend is the emergence of programmable, extendable processor architectures, as well as a resurgence of reconfigurable architecture that can switch algorithms within very small timeframes.

It is worth noting that while a first wave of reconfigurable architectures was introduced in the early 2000s with long-forgotten startups like Adaptive Silicon, Elixent, Triscend, Morphics, Chameleon Systems, Quicksilver Technology and MathStar, they are finding a revival now with defense-specific programs.

The key benefit in system optimization of the shift-left trend is the early visibility of system size, weight and power characteristics. As for functional bugs, shift-left enables a choice of where to draw boundaries, and subsequently move them. That is, hardware-software tradeoff optimization is enabled for performance, power, thermal and reliability.

Hardware emulation, a critical design automation technology required to make the shift-left a reality, can also often be combined with commercial virtual prototyping and “software-based emulation” based on open-source technologies like QEMU and VirtualBox.

Not all emulators are equal. Accurate SWaP tradeoffs begin with accurate hardware representations. Register-transfer-level languages — such as VHDL, Verilog and SystemVerilog — can help with this. Hardware-software co-optimization methodologies require the accuracy of RTL hardware emulation.

To allow app development in a decoupled fashion, software-based emulation using technologies like QEMU and VirtualBox are often employed or provided in Android and iOS software development kits. The techniques that design teams choose to adopt depend on accuracy/performance/availability tradeoffs. Typically, the higher in the software stack the software to be developed resides, the more abstract the representations for development are.

The Defense Department’s adoption of DevOps is an excellent first step, but it can’t be the last. It’s very tempting for some within the department to consider the adaptation of DevOps to be the easy fix to address their electronic system development, sustainment and modernization issues.

However, while we always hope for the easy fix — the one simple change that will erase a problem in a stroke — we all know that few things in life work this way.

Instead, success requires making 100 small steps go right — one after the other, no slipups, no goofs, everyone pitching in. Furthermore, we know that to truly solve a problem, one must tackle the root cause, not the effect. Hence, the rationale for the famous Alan Kay quote, “People who are really serious about software should make their own hardware.”

To meet the intent of the 2018 National Defense Strategy goal for microelectronics, both hardware and software development must be addressed.

And the major lesson from the successful commercial electronics systems companies is this: If you really want good software for your system, you’ve got to have really good hardware that’s been developed right alongside your software.

James S.B. Chew is group director, Frank Schirrmeister is senior group director of solutions marketing, and Steve Carlson is director of aerospace and defense solutions at Cadence Design Systems. (Source: glstrade.com/National Defense)

09 Oct 20. Developing new battery technologies: apply for business funding. Businesses can apply for a share of up to £10m to support development of innovative battery technologies for electric vehicles. There is a growing global demand for new and more efficient batteries to support the switch to electric transport. This demand is driven by plans by the UK and other governments to ban the sale of conventional petrol and diesel vehicles within the next two decades.

The battery supply chain could be worth £12bn to the UK economy by 2025 if the country can establish itself as a global leader in battery technology.

The UK government’s Industrial Strategy Challenge Fund Faraday Battery Challenge has up to £317m to help businesses and researchers to develop market-leading battery technologies.

Innovate UK, as part of UK Research and Innovation, has up to £10m from the fund to invest in feasibility studies and in research and development into promising and innovative battery technologies.

Work could include battery cost, efficiency, and recycling

Projects can focus on a variety of improvements to battery technologies for the propulsion of electric vehicles. They could look at automotive applications or other sectors such as rail, marine, aerospace, defence, or off-highway vehicles where innovation could meet challenging performance requirements or enable electrification.

Areas of work could include:

  • cost reduction at the cell and pack level and in manufacturing
  • increasing energy density of battery cells
  • increasing the power density of battery packs
  • eliminating thermal runaway risks
  • lengthening cell and pack life
  • broadening the temperature ranges for efficient operation of a pack
  • new models to better predict range and battery health
  • recyclability, including second life, design for end of life, reuse, or recycling
  • technologies enabling the efficient design, development, or manufacture of batteries
  • next-generation battery technologies

Competition information for feasibility studies

  • the competition is open, and the deadline for applications is at 11am on 9 December 2020
  • projects can be led by a business of any size working with other businesses or researchers
  • projects could range in size between £100,000 and £1m and last between 3 and 12 months

Competition information for research and development

  • the competition is open, and the deadline for applications is at 11am on 9 December 2020
  • projects can be led by a business of any size working with other businesses or researchers
  • projects could range in size between £300,000 and £1.5m and last between 3 and 12 months (Source: https://www.gov.uk/)

 

11 Oct 20. PHASA-35 successfully completes critical endurance trials with sensor payload. Further to initial flight trials, PHASA-35®, a 35 metre wingspan solar-electric aircraft, has successfully completed critical endurance trials which saw the aircraft operate for 72 hours in a simulated environment that models the harsh stratospheric conditions in which the aircraft is designed to operate.

The trials, a collaborative effort by BAE Systems, Prismatic and the UK’s Defence Science and Technology Laboratory (Dstl), further advance the aircraft’s operational capability.

Known as critical ‘soak’ tests, the trials demonstrated the aircraft working effectively as a fully integrated system together with Dstl’s communications sensor payload; a radio frequency sensing software defined radio that provides a real-time and secure data link.

The trials further validated that the aircraft’s systems are capable of enduring the harsh temperature and pressure extremes experienced in the stratospheric environment.

Exploiting BAE Systems’ capabilities in digital testing and flight systems has enabled the testing to be completed through a series of highly representative ground-based tests, driving pace and reducing costs in the development phase of the programme.

The tests, which were undertaken in a dedicated 40m hangar at Prismatic’s facility near Farnborough, also enabled the team to practice the various operations needed in flight, including the transition from daytime, when the aircraft is powered by the solar array, to night-time, when the aircraft’s batteries are discharged.

Paul Mather, Principal Payload Adviser, Dstl, said: “BAE Systems and Prismatic have put the integration and operation of the user payload at the heart of the PHASA-35 design and it has been very satisfying to work with the team in so clearly showing the benefits of this approach. Dstl has a proud tradition of rapid proving of new technologies which provide military and security advantage, which this latest success reinforces.”

Ian Muldowney, Chief Operating Officer, BAE Systems Air, added: “PHASA-35 is a great example of how we’ve brought together the best in British expertise and partnered to drive technological innovation and deliver critical capability. This latest success, only eight months after PHASA-35’s maiden flight, further demonstrates how UK industry and our partners are accelerating pace to deliver the UK’s vision for innovation, a Future Combat Air System and information advantage.”

Paul Brooks, Managing Director, Prismatic, said: “I am extremely proud of the efforts the team have put into making these trials a success and to do this despite the challenges that a global pandemic has brought to us all. By taking the best from the large company experience that BAE Systems offers, together with the agility of a small, innovative company such as Prismatic, we’ve been able to drive the programme forward with continued pace, culminating in the seamless integration of this first payload. This is an important milestone in bringing PHASA-35 closer to market, working alongside DSTL in the process.”

Further flight trials are due to take place in the coming months, and this latest milestone is another step forward for the aircraft, which could enter initial operations with customers within 12 months of completing its flight trials programme.

The PHASA-35 high-altitude, long-endurance unmanned aerial vehicle (HALE UAV) successfully completed its first flight in February, less than two years from initial design. The UAV has the potential to maintain flight for up to a year at a time, in the stratosphere, providing military and commercial customers with capabilities not currently available from existing air and space programmes. PHASA-35 has a wide range of potential applications such as the delivery of communications networks, including 5G, as well as support to disaster relief and border protection. Its payload capacity can be adapted to meet the needs of the user to carry sensors such as cameras, thermal imaging and communications equipment.

The aircraft’s long-life battery and highly efficient solar technology could allow PHASA-35 to maintain flight for up to a year in the stratosphere, the upper regions of the Earth’s atmosphere, helping to plug the gap between aircraft and satellite technology.

09 Oct 20. USAF sends software updates to one of its oldest aircraft midair. For the first time, the U.S. Air Force updated the software code on one of its aircraft while it was in flight, the service announced Oct. 7.

And there’s a surprise twist: The aircraft involved wasn’t the “flying computer” F-35, the mysterious B-21 bomber still under development, or any of the Air Force’s newest and most high-tech jets. Instead, the service tested the technology aboard the U-2 spy plane, one of the oldest and most iconic aircraft in the Air Force’s inventory.

On Sept. 22, the U-2 Federal Laboratory successfully updated the software of a U-2 from the 9th Reconnaissance Wing, which was engaged in a training flight near Beale Air Force Base, California, the Air Force said in a news release.

To push the software code from the developer on the ground to the U-2 in flight, the Air Force used Kubernetes, an open-source container orchestration system that automates the deployment and management of containerized software applications. The technology was originally created by Google and is currently maintained by the Cloud Native Computing Foundation.

For the demonstration, the U-2 lab employed Kubernetes to “run advanced machine-learning algorithms” on the four flight-certified computers onboard the U-2, modifying the software without negatively affecting the aircraft’s flight or mission systems, the service said.
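For readers unfamiliar with the mechanism, a containerized application managed by Kubernetes is typically updated by patching a Deployment’s container image through the Kubernetes API, which triggers a rolling update that swaps running containers for the new version. The short sketch below illustrates that generic workflow using the open-source Kubernetes Python client; the deployment name, namespace and image are hypothetical placeholders, not details of the U-2 demonstration.

  # Minimal sketch of a Kubernetes rolling update via the official Python client.
  # All names below (deployment, namespace, image) are illustrative placeholders.
  from kubernetes import client, config

  def roll_out_new_image(deployment: str, namespace: str, image: str) -> None:
      """Patch the Deployment's container image; Kubernetes then performs a
      rolling update, replacing running pods one at a time with the new version."""
      config.load_kube_config()          # credentials from the local kubeconfig
      apps = client.AppsV1Api()
      patch = {
          "spec": {
              "template": {
                  "spec": {
                      "containers": [
                          # container name must match the one defined in the Deployment
                          {"name": "mission-algorithm", "image": image}
                      ]
                  }
              }
          }
      }
      apps.patch_namespaced_deployment(name=deployment, namespace=namespace, body=patch)

  if __name__ == "__main__":
      roll_out_new_image("ml-inference", "mission", "registry.example.com/ml-inference:1.1.0")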

“The successful combination of the U-2′s legacy computer system with the modern Kubernetes software was a critical milestone for the development of software containerization on existing Air Force weapon systems,” said Nicolas Chaillan, the Air Force’s chief software officer.

During a Sept. 15 interview with C4ISRNET, Chaillan hinted that the service would soon be able to update the software of flying aircraft, calling the capability a “gamechanger” and describing the challenges involved with ensuring the aircraft could be updated without posing a safety risk.

“We need to decouple the flight controls, the [open-mission systems], all the air worthiness piece of the software from the rest of the mission [and] capability of [that] software so we can update those more frequently without disrupting or putting lives at risk when it comes to the flying piece of the jet or the system,” Chaillan said then.

In its news release, the Air Force did not elaborate on the nature of the software update pushed to the U-2, how it was validated, or how the aircraft was modified for the demonstration. A spokesperson for the 9th Reconnaissance Wing did not respond to questions from Defense News by press time.

Col. Heather Fox, 9th RW commander, said the demonstration could pave the way for more experiments using the U-2 as a test bed for agile software development activities.

“The integration of Kubernetes onto the U-2 capitalizes on the aircraft’s high-altitude line of sight and makes it even more survivable in a contested environment,” she said. “We look forward to working with other platforms across the [Department of Defense] to export this incredible capability.”

The 9th RW is the only unit that operates the 33 U-2s owned by the Air Force. Although the aircraft first flew in 1955, a full 65 years ago, the small cadre of pilots who fly it has sought to preserve its viability and has taken an unusually direct role in shaping aircraft improvements and upgrades.

Members of the wing’s 99th Reconnaissance Squadron told Defense News in 2017 about efforts to build a stronger relationship with Silicon Valley, with members sometimes working directly with the base’s contracting personnel to buy off-the-shelf goods like Garmin watches or tablets, or signing agreements with major tech companies to evaluate new products.

The 9th RW also works closely with the U-2 Federal Laboratory, which was established to develop new software for the U-2 and test it in a safe environment, as well as with operators, software coders and acquisitions professionals. (Source: Defense News)

10 Oct 20. AIP Submarines Could Kill Nuclear Submarines’ Naval Dominance. Nuclear-powered submarines have traditionally held a decisive edge in endurance, stealth and speed over cheaper diesel submarines.

Here’s What You Need To Remember: Who would have guessed nuclear reactors are incredibly expensive? AIP-powered submarines have generally cost between $200m and $600m, meaning a country could easily buy three or four medium-sized AIP submarines instead of one nuclear attack submarine.

Nuclear-powered submarines have traditionally held a decisive edge in endurance, stealth and speed over cheaper diesel submarines. However, new Air Independent Propulsion (AIP) technology has significantly narrowed the performance gap on a new generation of submarines that cost a fraction of the price of a nuclear-powered boat.

A conventional submarine’s diesel engine generates electricity which can be used to drive the propeller and power its systems. The problem is that such a combustion engine is inherently quite noisy and runs on air—a commodity in limited supply on an underwater vehicle. Thus, diesel-powered submarines must surface frequently to recharge their batteries.

The first nuclear-powered submarines were brought into service in the 1950s. Nuclear reactors are quieter, don’t consume air, and produce greater power output, allowing nuclear submarines to remain submerged for months instead of days while traveling at higher speeds under water.

These advantages led the U.S. Navy to phase out its diesel boats in favor of an all-nuclear powered submarine fleet.  However, most other navies have retained at least some diesel submarines because of their much lower cost and complexity.

In the 1990s, submarines powered by Air Independent Propulsion (AIP) technology entered operational use. Though the concept dated back to the 19th century and had been tested in a few prototype vessels, it was left to Sweden to deploy the first operational AIP-powered submarine, the Gotland class, which proved stealthy and relatively long-enduring. The 60-meter-long Gotlands are powered by a Stirling-cycle engine, a heat engine consuming a combination of liquid oxygen and diesel fuel.

Since then, AIP-powered submarines have proliferated across the world using three different types of engines, with nearly 60 operational today in fifteen countries. Around fifty more are on order or being constructed.

China has 15 Stirling-powered Yuan-class Type 039A submarines with 20 more planned, as well as a single large Type 032 missile submarine that can fire ballistic missiles. Japan, for her part, has eight medium-sized Soryu-class submarines that also use Stirling engines, with 15 more planned or under construction. The Swedes, for their part, have developed four different classes of Stirling-powered submarines.

Germany has also built dozens of AIP powered submarines, most notably the small Type 212 and 214, and has exported them across the globe. The German boats all use electro-catalytic fuel cells, a generally more efficient and quiet technology than the Stirling, though also more complex and expensive. Other countries intending to build fuel-cell powered submarines include Spain (the S-80), India (the Kalvari-class) and Russia (the Lada-class).

Finally, France has designed several submarines using a closed-cycle steam turbine called MESMA. Three upgraded Agosta 90B-class boats with MESMA engines serve in the Pakistani Navy.

Nuclear vs. AIP: Who Wins?

Broadly speaking, how do AIP vessels compare in performance to nuclear submarines? Let’s consider them in terms of stealth, endurance, speed and cost.

Stealth:

Nuclear-powered submarines have become very quiet, at least an order of magnitude quieter than a diesel submarine with its engine running. In fact, nuclear-powered submarines may be unable to detect each other using passive sonar, as evidenced by the 2009 collision between a British and a French nuclear ballistic missile submarine, both oblivious to the presence of the other.

However, there’s reason to believe that AIP submarines can, if properly designed, swim underwater even more quietly. The coolant pumps in a nuclear reactor produce noise as they circulate coolant, while an AIP submarine’s engines are virtually silent. Diesel-powered submarines can also approach this level of quietness while running on battery power, but can only do so for a few hours, whereas an AIP submarine can keep it up for days.

Diesel and AIP powered submarines have on more than one occasion managed to slip through anti-submarine defenses and sink American aircraft carriers in war games. Of course, such feats have also been performed by nuclear submarines.

Endurance:

Nuclear submarines can operate underwater for three or four months at a time and cross oceans with ease. While some conventional submarines can handle the distance, none have comparable underwater endurance.

AIP submarines have narrowed the gap, however.  While old diesel submarines needed to surface in a matter of hours or a few days at best to recharge batteries, new AIP powered vessels only need to surface every two to four weeks depending on type. (Some sources make the unconfirmed claim that the German Type 214 can even last more than 2 months.) Of course, surfaced submarines, or even those employing a snorkel, are comparatively easy to detect and attack.

Nuclear submarines still have a clear advantage in endurance over AIP boats, particularly on long-distance patrols. However, for countries like Japan, Germany and China that mostly operate close to friendly shores, extreme endurance may be a lower priority.

Speed:

Speed remains an undisputed strength of nuclear-powered submarines. A U.S. attack submarine may be able to sustain speeds of more than 35 miles per hour while submerged. By comparison, the German Type 214’s maximum submerged speed of 23 miles per hour is typical of AIP submarines.

Obviously, high maximum speed grants advantages in both strategic mobility and tactical agility.  However, it should be kept in mind that even nuclear submarines rarely operate at maximum speed because of the additional noise produced.

On the other hand, an AIP submarine is likely to move at especially slow speeds when cruising sustainably on AIP compared to diesel or nuclear submarines. For example, a Gotland-class submarine is reduced to just 6 miles per hour if it wishes to remain submerged at maximum endurance, which is simply too slow for long-distance transits or traveling with surface ships. Current AIP technology doesn’t produce enough power for higher speeds, and thus most AIP submarines also come with noisy diesel engines as backup.
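That endurance penalty at speed follows from a general rule of thumb: hydrodynamic drag power rises roughly with the cube of speed, so the energy drawn per mile grows roughly with the square of speed. The sketch below illustrates that scaling only; it is a textbook approximation with placeholder figures, not performance data for any particular boat.

  # Illustrative scaling only: drag power ~ k * v^3, so energy per unit distance
  # ~ k * v^2, and range on a fixed energy store falls with the square of speed.
  # Speeds and the 5-knot reference are arbitrary placeholder values.
  def relative_range(speed_knots: float, reference_speed_knots: float = 5.0) -> float:
      return (reference_speed_knots / speed_knots) ** 2

  for v in (5, 10, 20):
      print(f"{v:>2} kn -> ~{relative_range(v):.2f}x the range available at 5 kn")
  # 5 kn -> 1.00x, 10 kn -> 0.25x, 20 kn -> 0.06x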

Cost:

Who would have guessed nuclear reactors are incredibly expensive? The contemporary U.S. Virginia-class attack submarine costs $2.6bn, and the earlier Los Angeles class before it around $2bn in inflation-adjusted dollars. Mid-life nuclear refueling costs add millions more.

By comparison, AIP-powered submarines have generally cost between $200m and $600m, meaning a country could easily buy three or four medium-sized AIP submarines instead of one nuclear attack submarine. Bear in mind, however, that AIP submarines are mostly small or medium-sized vessels with crews of around 30 and 60 respectively, while nuclear submarines are often larger with crews of 100 or more. They may also have heavier armament, such as Vertical Launch Systems, when compared to most AIP-powered vessels.
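As a rough check of that arithmetic, using only the round unit costs quoted in this article (and ignoring crewing, weapons fit and life-cycle costs):

  # Back-of-the-envelope fleet comparison using the article's round figures only.
  nuclear_attack_sub = 2.6e9                 # Virginia-class unit cost, USD
  aip_sub_low, aip_sub_high = 200e6, 600e6   # quoted AIP price range, USD

  print(f"{nuclear_attack_sub / aip_sub_high:.1f} to "
        f"{nuclear_attack_sub / aip_sub_low:.1f} AIP boats per nuclear boat")
  # Roughly 4 at the top of the AIP price range, consistent with the article's
  # "three or four" for larger AIP designs, and up to about 13 at the low end.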

Nevertheless, a torpedo or missile from a small submarine can hit just as hard as one fired from a large one, and having three times the number of submarines operating in a given stretch of ocean could increase the likelihood of chancing upon an important target and make it easier to overwhelm anti-submarine defenses.

While AIP vessels may not be able to do everything a nuclear submarine can, having a larger fleet of submarines would be very useful in hunting opposing ships and submarines for control of the seas. Nor would it be impossible to deploy larger AIP-powered submarines; China has already deployed one, and France is marketing a cheaper AIP-powered version of the Barracuda-class nuclear attack submarine.

It is no surprise that navies that operate largely in coastal waters are turning to cheap AIP submarines, as their disadvantages are not as relevant when friendly ports are close at hand. The trade-off in range and endurance is more problematic for the U.S. Navy, which operates across the breadth of the Atlantic, Pacific and Indian Oceans. This may explain why the U.S. Navy has shown little inclination to return to non-nuclear submarines. However, AIP submarines operating from forward bases would represent a very cost-effective and stealthy means to expand the Navy’s sea-control mission.

Sébastien Roblin holds a Master’s Degree in Conflict Resolution from Georgetown University and served as a university instructor for the Peace Corps in China. He has also worked in education, editing, and refugee resettlement in France and the United States. He currently writes on security and military history for War Is Boring. This first appeared in December 2018. (Source: News Now/https://nationalinterest.org)

08 Oct 20. New URC-300™ Radio Now 25kHz and 8.33kHz ETSI Compliant for Global Operation. General Dynamics announced that its new URC-300™ radio has completed European Telecommunications Standards Institute (ETSI) European Standard (EN) 300 676 testing and is now in full compliance with all 25kHz and 8.33kHz VHF specifications. In addition, the URC-300 recently completed the ETSI EN 302 617 UHF testing and is also compliant with 25kHz UHF specifications. The tests were performed by an independent accredited testing service. These certifications make the URC-300 the first portable ruggedized man-pack dual band transceiver to be approved for global operation against the stringent aviation spectrum standards. Orders placed by the U.S. Air Force will begin shipping in December.

Certifications Obtained in Advance

General Dynamics understands that the spectrum approval process can be highly complex and time consuming, especially outside the U.S. As a result, General Dynamics will obtain all required certifications in advance to eliminate purchase, approval and spectrum roadblocks, and help streamline deployment without delays. In addition to the ETSI EN compliance, the URC-300 is also compliant with Radio Equipment Directive (RED), REACH, RoHS, and is certified by the Federal Communication Commission (FCC) for civilian use. Additionally, the U.S. Air Force is sponsoring the JF-12 process which will enable the URC-300 to operate in the U.S. DoD Spectrum.

About the URC-300

The URC-300 is a versatile platform that supports multiple waveforms and provides exceptional radio frequency (RF) performance to support ground-to-air, line-of-sight and other mission-critical applications. It provides communications free from interference in highly congested environments and improves immunity to outside interference such as other airfield channels, Wi-Fi transmitters and commercial FM broadcast towers. Users can operate multiple radios as close as 6.5 ft. apart without interference, an unprecedented capability compared to currently available tactical man-pack radios that require at least 50 to 115 feet of separation. This close-proximity capability enables rapid grab-and-go, multi-channel operations during emergency situations.

The radio is specifically designed to enable future features and functions to be added in the field via quick and simple software upgrades. The radio meets MIL-STD-810 requirements for ruggedization and the newly redesigned front panel has a functional display and a simple, intuitive keypad interface that is glove-friendly. The URC-300 is interoperable with its predecessor the URC-200™(V2) radio and many of its accessories.

08 Oct 20. Aitech Systems, a leading provider of rugged boards and system-level solutions for military, aerospace and space applications, now offers its C530 GPGPU board with powerful NVIDIA® GPUs, based on NVIDIA’s Turing™ architecture. The rugged, high-performance board helps designers overcome major hurdles in the rugged AI landscape by providing accelerated data processing of multiple streams simultaneously, while withstanding extreme environmental conditions.

The enhanced 3U VPX board blends best-in-class NVIDIA technology with Aitech’s powerful ruggedization and SWaP capabilities. Because it is based on COTS, open-standard architectures, the C530 can be easily utilized in a number of platforms and applications, from military and defense to industrial.

The use of multi-layered artificial neural networks and the NVIDIA Turing architecture-based GPUs’ ability to concurrently execute floating-point and integer operations give the board a distinct performance advantage over non-GPGPU-accelerated architectures. When equipped with the NVIDIA RTX 3000 GPU, the C530 houses 1920 CUDA® cores for parallel processing, 240 Tensor Cores for AI inference, 30 RT ray-tracing cores for real-time rendering and 6 GB GDDR6, delivering up to 5.3 TFLOPS (FP32). The T1000 GPU’s 896 CUDA cores and 4 GB GDDR5 enable 2.6 TFLOPS (FP32) of data processing.
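Those throughput figures follow from the usual peak-FP32 estimate of two operations (one fused multiply-add) per CUDA core per clock cycle. The sketch below reproduces the quoted numbers; the boost clocks used are assumed typical values for these GPUs, not figures taken from the announcement.

  # Peak FP32 throughput ~= 2 ops (one FMA) x CUDA cores x boost clock.
  # Boost clocks below are assumed, illustrative values.
  def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
      return 2 * cuda_cores * boost_clock_ghz / 1e3

  print(f"RTX 3000: {peak_fp32_tflops(1920, 1.38):.1f} TFLOPS")   # ~5.3, as quoted
  print(f"T1000:    {peak_fp32_tflops(896, 1.46):.1f} TFLOPS")    # ~2.6, as quoted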

Dan Mor, GPGPU product line manager for Aitech, noted, “There’s a growing need for more advanced platforms that enable higher computation power and relatively low power consumption, while handling many independent video outputs and providing a large throughput back out to the network. Our enhanced C530 uses the power of NVIDIA to provide these performance needs for compute-intensive applications like AI delivery, video analytics and image processing.”

The C530 delivers state-of-the-art accuracy in complex tasks such as object detection, classification, segmentation and motion detection.  It is used widely throughout rugged HPEC applications, including unmanned and autonomous vehicles—both ground and aerial—and in surveillance, targeting and advanced weapons systems found in naval, avionics and industrial environments.

Via an MXM site, the C530 currently supports the NVIDIA RTX 3000 GPU, consuming only 80 W, and the NVIDIA T1000 GPU, consuming only 50 W, with new configurations released as higher-performance MXMs become available.

The C530 operates as a peripheral board alongside a compatible x86 VPX host SBC, connecting to the host over the VPX backplane via a high-speed PCIe Gen3 link of up to 16 lanes. Four independent HDMI video ports each support resolutions of up to 1600×1200 at 60 Hz.
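For a sense of the host-link headroom, PCIe Gen3 signals at 8 GT/s per lane with 128b/130b encoding, so a full 16-lane link offers a theoretical peak of roughly 15.8 GB/s in each direction before protocol overhead; the short calculation below works through that figure.

  # Theoretical peak bandwidth of a PCIe Gen3 x16 link (per direction).
  lanes = 16
  gt_per_s = 8.0                  # Gen3 line rate per lane, GT/s
  encoding = 128 / 130            # 128b/130b encoding efficiency
  gb_per_s = lanes * gt_per_s * encoding / 8   # bits -> bytes
  print(f"PCIe Gen3 x{lanes}: ~{gb_per_s:.1f} GB/s per direction")   # ~15.8 GB/s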

09 Oct 20. University of Queensland partners with US DoD on pain research. The US Department of Defense has provided $1.4m in funding to the University of Queensland to continue research into back pain as part of an international study. Led by the University of Queensland, the three-year project will span Australia and the US and involve some of the world’s most prominent pain researchers.

Dr David Klyne, Fulbright Fellow at the UQ School of Health and Rehabilitation Sciences, said the study aimed to reveal how pain evolved from a brief acute episode to an ongoing, chronic state.

“Over the last few years our research has helped establish that an overly sensitive central nervous system, driven by inflammation, may play an early key role in this acute to chronic transition. This response is likely shaped by the type of tissue initially injured, such as muscle or nerve, and other factors like sleep and physical activity,” Dr Klyne explained.

Lower back pain is the leading cause of disability globally. It is the sixth most costly condition in the US, and it is the leading cause of medical discharge in the US military.

“Beyond advancing our understanding of the physiology of pain, the project is designed to identify new risk factors for developing chronic pain, particularly back pain, which can be targeted by interventions.”

Dr Klyne added, “Understanding how sleep and exercise might be used to target these and other risk factors can help prevent the transition from acute to chronic pain.”

In Australia, up to 80 per cent of the population experience back pain, with direct healthcare costs of $4.8bn per year.

Dr Klyne said the results of the study would have the potential to quickly translate from research into practice.

“The clinical applicability of these results would be immediate, both in terms of new and refined non-pharmacological treatments for pain as well as influencing advice given by practitioners and strategies advised for self-management,” he explained. (Source: Defence Connect)

————————————————————————-

Oxley Group Ltd

Oxley specialises in the design and manufacture of advanced electronic and electro-optic components and systems for air, land and sea applications within the military sector. Established in 1942, Oxley has manufacturing facilities in the UK and USA and enjoys representation worldwide.  The company’s products include night vision and LED lighting, data capture systems and electronic components. Oxley has pioneered the development of night vision compatible lighting.  It offers a total package incorporating optical filters, equipment modification, cockpit and external lighting along with fleet wide upgrade services including engineering, installation, support, maintenance and training. The company’s long experience of manufacturing night vision lighting and LED indicators, coupled with advances in LED technology, has enabled it to develop LED solutions to replace incandescent and fluorescent lighting in existing applications as well as becoming the lighting option of choice in new applications such as portable military hospitals, UAV control stations and communication shelters.

———————————————————————-
