Sponsored By Oxley Developments
13 Feb 19. DOD wants to ‘change the way we do software development.’ The Defense Department’s acquisition branch is ready to blow up some long-standing traditions to make the organization more digital-ready, starting with the DOD 5000 series acquisition regulations and the Defense Acquisition University. Because almost every weapons system will be software-intensive, the DOD 5000 must also become friendly to software development and allow for rapid and continuous updating, Assistant Secretary of Defense for Acquisition Kevin Fahey said during a keynote presentation at the National Defense Industrial Association’s Section 809 Panel event Feb. 13.
“We’re going to start from scratch. And we’re going to try to lose the 5000 mentality” of having a set of requirements that need to be tailored for a program, Fahey said. But the chief focus will be on software development.
“We are absolutely in a situation where we are trying to change the way we do software development,” he said, adding that making the DOD 5000 series digital would be a challenge.
The Section 809 Panel, which Congress tasked with improving DOD’s technology procurement practices, also proposed changes to DOD 5000 in its January report, including best practices for portfolio management. Fahey said the idea would be to incorporate the panel’s recommendations while also overhauling other sections.
The Defense Acquisition University is also up for reform, moving toward conducting actual job training rather than acting as a certification authority.
“We’ve lost the bubble a little bit on [making] sure that we’re qualified to do our job more so than certified to do our job. And a lot of it is … you learn from the hard knocks, and … you get back to the good behaviors,” Fahey said.
In the last several years, the workforce has become more risk-averse. The idea is to challenge the existing structure with ongoing pilots on other transaction authorities and agile software development, which would entail a shorter training class, possibly three days, rather than a 10-week certification course, Fahey said.
The Defense Department is also considering a cyber scoring process that would certify contracting companies and enhance supply chain security.
Fahey said the process would function similarly to a credit score, and vendors would be required to have a certain score to work on certain missions, such as nuclear defense. DOD’s Acquisition and Sustainment Office is currently in talks with defense industry organizations to develop standards on which to base a cyber score.
“We’re already behind,” Fahey said, noting that the financial and health sectors already have similar systems.
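The credit-score analogy above can be made concrete with a small sketch. Everything here is an illustrative assumption: the factor names, weights, 300-850 scale, and mission thresholds are invented for the example, since DOD is still developing the actual standards with industry.

```python
# Hypothetical credit-score-style vendor rating. All factor names, weights,
# and thresholds below are illustrative assumptions, not DOD standards.

def cyber_score(factors, weights):
    """Combine per-factor assessments (0.0-1.0) into a 300-850 style score."""
    total_weight = sum(weights.values())
    weighted = sum(factors[name] * weights[name] for name in weights)
    return 300 + 550 * (weighted / total_weight)

# Notional assessment factors for a vendor.
weights = {"patch_cadence": 0.4, "supply_chain_vetting": 0.35, "incident_history": 0.25}
vendor = {"patch_cadence": 0.9, "supply_chain_vetting": 0.7, "incident_history": 0.8}

score = cyber_score(vendor, weights)

# As described above, certain missions would require a minimum score.
MISSION_THRESHOLDS = {"nuclear_defense": 800, "general_it": 600}

def eligible(score, mission):
    return score >= MISSION_THRESHOLDS[mission]
```

In this sketch the vendor scores roughly 743, clearing the notional general-IT bar but not the nuclear-defense one, which mirrors the tiered access Fahey described.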
While there isn’t a timeline for when this function would be implemented, Fahey said a plan would be cemented this year; the challenge will be rolling it out across the industry base. (Source: Defense Systems)
15 Feb 19. Polish MND postpones Rosomak Battle Management System programme. The Polish Ministry of National Defence (MND) is essentially postponing its Rosomak Battle Management System (BMS) programme, defence sources associated with the effort have told Jane’s. The Rosomak BMS programme, sources said, will be reverted to a research and development (R&D) effort after it was concluded by the MND’s Armament Inspectorate that the current offering did not meet expectations. According to the sources, one of the MND’s concerns related to encryption levels of the BMS solution. Neither the MND nor Polska Grupa Zbrojeniowa (PGZ) was available to comment on the current state of the programme. A contract was expected to be signed in January 2019 to equip an initial tranche of 14 Rosomak 8×8 armoured personnel carriers (APCs) with the BMS capability following nearly two years of negotiations between the MND’s Armament Inspectorate and Polish state-run company PGZ, which acted as systems integrator in the programme. (Source: IHS Jane’s)
14 Feb 19. Crystal Group, Inc., a leading designer/manufacturer of rugged computer hardware, is expanding its Rugged Embedded (RE) product line with the RE1529 Rugged Embedded Computer, debuting in booth #1635 at the premier naval conference and exhibition, WEST 2019, held February 13 – 15 in San Diego, California. The reliable, feature-rich, configurable Crystal Group RE1529 is engineered with the latest commercial off-the-shelf (COTS) technologies, including Intel® chipsets and processors, stabilized in a compact, rugged enclosure to provide robust compute power over a long operational life in even the most extreme environments. The RE1529 is manufactured with lightweight composites and delivered with Xeon D multi-core processors, up to 128GB of ECC DDR4 RAM, nine internal 7/9mm SATA solid-state drive (SSD) bays, and flexible I/O in a rugged package measuring just 6 x 15.4 x 9.6 inches and weighing 7.5 pounds (3.4 kg). The RE1529 provides an incredibly flexible server-class architecture that also accommodates customers’ specific input/output and third-party card requirements. From the ground up, this product has been designed to survive extreme vibration and environmental conditions which meet or exceed all military requirements. This ultra-compact package has been completely designed and manufactured in the USA and comes with a 5-year warranty.
“Providing server-class performance in a compact, embedded footprint, our latest Rugged Embedded Computer, the RE1529, couples a sixteen-core Intel Xeon D processor with data storage capacity in a fully customizable, scalable system to meet both current and future requirements,” says Crystal Group EVP of Engineering, Jim Shaw. “The RE1529 is designed and tested for SWaP-sensitive applications, including airborne and transit case use, with a minimal footprint and wide-range voltage inputs. The system is virtually maintenance free with top performance in harsh conditions including temperature extremes, high shock and vibration, high humidity, high altitudes, and dust.”
The Crystal Group RE1529 features high-quality components in a rugged enclosure made from carbon fiber composites and billet milled, strain-hardened aluminum. Crystal Group engineers its rugged and reliable RE1529 to withstand challenging applications, harsh conditions, and extreme environments, including vibration-intense helicopter applications. The RE1529 is designed and tested to meet or exceed MIL-STD-810 and comply with Export Control Classification Number (ECCN) 5A992. Crystal Group taps 30 years of experience engineering high-performance, rugged computer and electronics hardware to bring impressive compute power and data handling to some of the harshest environments of energy, military, aerospace, government, industrial, and commercial applications.
14 Feb 19. LDRA, software quality experts in the areas of functional safety, security, and standards compliance, has partnered with Jama Software to deliver a test validation and verification lifecycle solution for safety- and security-critical applications in the embedded software market. The LDRA tool suite integrates with Jama Connect enabling developers to bidirectionally trace the relationships between the various stages of application development. Such transparency across the development lifecycle ensures faster development, improved code quality, and stronger risk mitigation.
For applications to be safe and secure, development teams must follow strict processes that ensure all aspects of development are fully traced, tested, and documented and comply with industry safety and security standards such as DO-178B/C for aerospace, ISO 26262 for automotive, IEC 61508 for industrial controls, and IEC 62304 for medical devices. To accomplish this, developers must map application requirements to software written, and that code must be tested to ensure that it does what it is expected to do, that all existing code maps to a given requirement, and that all interrelationships are bidirectionally traceable.
By integrating the LDRA tool suite with Jama Connect, LDRA closes the software verification gap by showing the relationships among application requirements, code, software verification, and documentation. LDRA automates software analysis and application verification to ensure software quality, while Jama Software provides a product development platform with broader visibility into product definition and design, including build and test phases. The Jama Connect management framework exposes the relationships and dependencies among projects, teams, activities, and results. Developers can quickly trace requirements, source code, test cases, and test execution results to gain insight into testing progress, code quality, stability of the system, and validation on the host and target platforms.
Thanks to this integration, developers can analyze the application by exporting Jama Connect project artifacts or item types into LDRA’s TBmanager to identify any feedback loops on software design, certification needs, testing, or code defects. The item types are typically Jama Connect multilevel requirements and test cases, which can be kept synchronized no matter which development lifecycle methodology has been adopted, enabling teams to work with the most current records. Also, progress and evidence reports can be automatically generated at any time. This comprehensive integration greatly increases efficiency in software testing and reduces costs of producing high-quality software including software that must go through qualification or certification.
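The bidirectional traceability described above (every requirement maps to tests, and every test maps back to a requirement) can be sketched with a minimal data structure. This is purely illustrative: the item names and mappings are hypothetical, and neither the LDRA tool suite nor Jama Connect exposes this exact representation.

```python
# Hypothetical sketch of bidirectional requirements-to-test traceability.
# Item IDs and file names are invented for illustration.

requirement_to_tests = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],            # a requirement with no test coverage yet
}
test_to_code = {
    "TC-101": ["brake_ctrl.c"],
    "TC-102": ["brake_ctrl.c"],
    "TC-103": ["sensor_io.c"],
    "TC-999": ["legacy.c"],   # a test not traced back to any requirement
}

def untested_requirements(req_map):
    """Forward gap: requirements lacking any linked test case."""
    return [req for req, tests in req_map.items() if not tests]

def orphan_tests(req_map, test_map):
    """Backward gap: test cases that trace to no requirement."""
    traced = {t for tests in req_map.values() for t in tests}
    return sorted(set(test_map) - traced)
```

Standards such as DO-178C and ISO 26262 require closing both gaps: a tool-supported workflow flags "REQ-003" (untested requirement) and "TC-999" (orphan test) so teams can resolve them before certification evidence is generated.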
“LDRA and Jama Software have been focused on giving developers the tools they need to build safe and secure systems for many years,” said Ian Hennell, Operations Director, LDRA. “Our visibility into code quality and compliance coupled with Jama’s focus on requirements and process enables us to help developers work smarter and faster as they create safe and secure code. Companies using our combined solution can verify regulatory compliance while reducing the cost of testing—no small feat as compliance costs spiral for many.”
“Integrating with the LDRA tool suite just made sense given our complementary focus,” said Scott Rogers, VP Business Development, Jama Software. “Software complexity, particularly in the automotive industry, has created unprecedented application risks for OEMs and their suppliers. Partnering with LDRA ensures that we can provide our clients with an end-to-end safety and security compliance solution that will reduce development time, improve software quality, and mitigate risk.”
13 Feb 19. LCR Embedded Systems Announces Family of MIL-STD 3- to 9-Slot VPX Enclosures for High-Performance, SWaP-C-Constrained Applications. Heat exchangers in the top/sidewall and rear panels, forced air, and heat pipes improve thermal performance, and the enclosures are highly customizable. LCR Embedded Systems is pleased to announce the availability of a new family of rugged, highly configurable 3U VPX chassis designed for compact aerospace and UAV applications requiring state-of-the-art power dissipation. This family of chassis is ideal for advanced military applications with significant SWaP-C considerations that must operate in hostile environments and require a great deal of processing power.
The enclosures have been qualified to MIL-STD-810 environmental and MIL-STD-461 electromagnetic compatibility requirements, making them an ideal solution for applications that demand significant processing power in the most punishing environments. The heat dissipation needed by high-performance cards is made possible by a variety of options including: hollow enclosure walls with heat exchangers, forced air, and heat pipes. Also available are completely sealed enclosures with internal fans that circulate air within the card cage, preventing “hot spots” and enabling the enclosures to function in highly contaminated environments.
3-, 5-, 6-, 7-, and 9-slot versions are available, and all enclosures share the same general architecture, power supply, and front/rear panels, providing increased flexibility and a path to cost-effective upgrades to match changing application demands.
“VPX was created to bring high performance to harsh military environments, but performance means heat dissipation – and the most powerful 3U VPX cards are only getting hotter,” states LCR Embedded Systems President Dan Manoukian. “SWaP has already become SWaP-C to include cooling, and our new enclosure family more than satisfies SWaP-C requirements in addition to offering the ruggedness that our customers’ applications demand. Perhaps,” he adds, “we should start referring to SWaPR-C instead.” CompactPCI and VME versions are also available for all chassis. (Source: BUSINESS WIRE)
12 Feb 19. Is the new strategy on artificial intelligence to play catch up? Artificial intelligence is code entrusted to reason through independent choices. The decisions themselves are the result of coded paths and inputs, feeding data into different parts of algorithms, weighting outcomes, and then creating an end product that is designed to be useful to the humans that consume it. There are degrees and worlds in AI, a vast space from deterministic to emergent behavior, from online machine learning to targeting tools, and it is in that complexity that care is most required, that human direction is most desired, that the fears and possibilities of generations of science-fiction authors await to be realized. It is a complexity, in form and function, that is lacking from the White House’s new AI strategy. For all intents and purposes, the document guiding the government’s approach to artificial intelligence might as well say “fancy software” and be done with it.
In the “Executive Order on Maintaining American Leadership in Artificial Intelligence,” the White House states that its efforts in AI are guided by five principles. Those principles are, roughly: driving technological breakthroughs, development of technical standards, training workers with new skills for an AI economy, balancing trust in AI and protection of civil liberties, and supporting “an international environment that supports American AI research and innovation and opens markets for American AI industries.”
Nowhere is the how or the what of AI spelled out. There is AI investment throughout the federal government, and the White House wants to make sure it continues in a permissive environment, but the term is an umbrella, a catch-all, with no specificity as to what it does, or why it might require efforts to train new workers.
“I applaud a number of aspects of the Executive Order, such as the proposal – mirroring the white paper I released last summer – to open federal data-sets to non-federal entities,” said Sen. Mark R. Warner (D-VA), vice chairman of the Senate Select Committee on Intelligence. “Overall, however, the tone of this Executive Order reflects a laissez-faire approach to AI development that I worry will have the U.S. repeating the mistakes it has made in treating digital technologies as inherently positive forces, with insufficient consideration paid to their misapplication.”
Warner’s white paper lightly describes AI, highlighting the use of machine learning to train algorithms on pattern recognition in images, for example. But overall the paper is focused not on the risks of how AI is coded, but on the dangers that could come from a small segment of the technology industry capturing and dominating an emerging market.
If there’s an obvious precedent to all this, it’s rooted in the early 1990s. A confluence of factors, including a push from the Clinton administration, proactive legislation in Congress, and favorable rulings from the courts created the conditions that would foster the Silicon Valley-led tech industry in its current form. Parallel to that initiative was getting the government itself to adopt the then-young technologies of the information age, all under the aegis of “reinventing government.”
While the White House is light on the specifics of what it wants AI to do, or even what exactly AI is, the Pentagon has a far more workable definition.
“AI refers to the ability of machines to perform tasks that normally require human intelligence – for example, recognizing patterns, learning from experience, drawing conclusions, making predictions, or taking action – whether digitally or as the smart software behind autonomous physical systems,” reads the Department of Defense AI strategy summary.
It helps that different parts of the Pentagon have been defining and working on AI in different ways. In June 2018, the Pentagon stood up the Joint Artificial Intelligence Center, or JAIC, and the military is already involved in the development and acquisition of specific AI projects. Project Maven, likely the most famous of these, was contracted through Google to adapt open-source tools to identify objects in drone videos.
“The impact of artificial intelligence will extend across the entire department, spanning from operations and training to recruiting and healthcare,” said Dana Deasy, the Defense Department’s chief information officer. “The speed and agility with which we will deliver AI capabilities to the warfighter has the potential to change the character of warfare. We must accelerate the adoption of AI-enabled capabilities to strengthen our military, improve effectiveness and efficiency, and enhance the security of our nation.”
The overall approach is focused on AI-enabled capabilities, using a shared and scalable foundation for AI, training an AI workforce, working with business, academia, and allies, and leading the world in “military ethics and AI safety.” That last section is likely to attract standalone significance, especially since some companies have already made their future work with the DoD contingent on how it responds to questions of AI ethics.
“We will invest in the research and development of AI systems that are resilient, robust, reliable, and secure; we will continue to fund research into techniques that produce more explainable AI; and we will pioneer approaches for AI test, evaluation, verification, and validation,” the document reads. “We will also seek opportunities to use AI to reduce unintentional harm and collateral damage via increased situational awareness and enhanced decision support.”
The strategy is a promise to develop principles more than an outline of principles themselves. Of particular note is the notion of AI as a tool explicitly for the reduction of harm and collateral damage. That same language is often used in the explanations for the use of precision weapons, though the circumstances guiding the development of precision weapons were all about battlefield utility first. AI that can reduce collateral damage is also likely AI that can be part of autonomous targeting. To its credit, the DoD strategy specifically acknowledges the risk that might come from “’emergent effects’ that arise when two or more systems interact, as will often be the case when introducing AI to military contexts.”
Borrowing and adapting innovations from the commercial space for tasks such as predictive maintenance and supply could be part of a quiet logistics revolution within the military. Warehouses adequately stocked with exactly what is needed or parts moved to the bases and troops before shortages occur could sustain operations at reduced cost, as imagined by the strategy, or at a higher tempo, as is possible when commanders adjust to the new normal.
Indeed, one of the better case scenarios for the military and the government’s adoption of more AI technologies is simply adapting what has already worked in the private sector and with contractors to new missions. If there is funding and effort behind the new AI plans, it could realize many of these advantages first in cybersecurity.
“Today, AI is at the center of most major technological advances in areas as varied as cybersecurity, self-driving cars or development of cancer treatments,” says Dmitri Alperovitch, the co-founder and chief technical officer of CrowdStrike. “In cybersecurity, for example, these technological advances include enabling the defenders to recognize and stop never-before-seen attacks and being faster in detecting and responding to attacks.”
How, exactly, the United States fosters, adopts, and uses the tools powered and enabled by AI is likely of major importance for decades to come. The existence of national strategies for AI suggests that the government at least knows it needs to take AI seriously. Both the DoD and the White House seem content to follow the lead of the private sector, where there is a tremendous amount of work being done on everything from warehouse management to automated surveillance systems and innovative cyber security techniques. Yet if there is any overall ethos guiding the strategies, it is a need to meet the technology as it already is. Much of the government’s AI strategy is about playing catch-up. (Source: C4ISR & Networks)
12 Feb 19. New US Strategy Outlines Path Forward for Artificial Intelligence. The Department of Defense on Feb. 12 released the summary of its strategy on artificial intelligence. The strategy, Harnessing AI to Advance Our Security and Prosperity, outlines how DOD will leverage AI into the future. Key tenets of the strategy are accelerating the delivery and adoption of AI; establishing a common foundation for scaling AI’s impact across DOD and enabling decentralized development and experimentation; evolving partnerships with industry, academia, allies and partners; cultivating a leading AI workforce; and leading in military AI ethics and safety.
The department’s strategic approach to AI emphasizes its rapid, iterative, and responsible delivery and then the use of lessons learned to create repeatable and scalable processes and systems that will improve functions and missions across the department. AI is poised to change the character of the future battlefield and the pace of threats faced in today’s security environment. The United States, together with its allies and partners, must adopt AI to maintain its strategic position and prevail on future battlefields.
AI will impact every corner of the department, spanning operations, training, sustainment, force protection, recruiting, healthcare and others.
The focal point of DOD AI is the Joint Artificial Intelligence Center, established last June under DOD Chief Information Officer Dana Deasy and led by Lt. Gen. John “Jack” Shanahan, to provide a common vision, mission and focus to drive department-wide AI capability delivery.
DOD’s AI strategy supports the National Defense Strategy and is part of DOD’s overall efforts to modernize information technology to support the warfighter, defend against cyber attacks and leverage emerging technologies.
12 Feb 19. Flight and maintenance logs, custom checklists and reports from the popular flight log website DroneLogbook are now conveniently accessible to all UgCS users. The Telemetry Sync tool is a response to demand from drone operators flying with UgCS to be able to store their telemetry data not only locally on the computer, but also in the cloud. In addition to providing operators with comprehensive flight logging, DroneLogbook.com also allows users to manage maintenance logs and generate custom checklists.
DroneLogbook is also useful in ensuring compliance with the law: it allows users to check the status of controlled airspace before planning a drone mission and generate reports in accordance with the requirements of aviation authorities such as the FAA, CAA, CASA and CAD. The recently updated Telemetry player tool of UgCS records all flight telemetry data, enabling users to replay and analyse each drone’s flight path and the videos recorded.
The free-of-charge add-on synchronisation feature is an effective tool both for individual drone operators and for enterprises with large drone fleets, as it allows all data to be stored in one place, even if multiple laptops are being used for drone mission control.
Download the UgCS Telemetry Sync tool on: http://ugcs.com/DroneLogbook
11 Feb 19. India’s DRDO’s ‘Dare to Dream’ contest. The Defence Research and Development Organisation (DRDO) has launched ‘Dare to Dream’, a contest to encourage startups and individuals to come up with innovative defence and aerospace technologies. Applicants are invited to send innovative, workable proposals that can impact various related domains. The winning entries, which should specify a plan for developing the proposal into a prototype, stand to get one of five prizes ranging from ₹3 lakh to ₹10 lakh in two categories.
The military R&D organisation has asked for solutions in the areas of Artificial Intelligence, Autonomous Systems, Cybersecurity, Hypersonic Technologies, Smart Materials, Quantum Computing, and Soldier as a System. DRDO labs are already working in these areas, an official said.
Last April, the Ministry of Defence initiated a wider plan — iDEX or Innovation for Defence Excellence — aimed at financially supporting innovators from among small- and medium-sized industries, individuals, startups, and institutes. (Source: Google/https://www.thehindu.com)
12 Feb 19. US Army Uses New “MacBook”-Sized Tablet to Operate Multiple Small Drones. The US Army is refining new small drone combat tactics to accommodate emerging technologies such as AI-enabled command and control, higher resolution sensors, faster computer processing, multi-drone control “tablets” and streamlined multi-drone interoperability. An Army-developed “MacBook”-size computer tablet enabling ground soldiers to control multiple drones on a single system, for instance, is nearing formal production — inspiring a tactical shift toward new drone attack strategies.
Common technical standards, established protocol, smaller form factors and fast-improving processing speeds are enabling the Army to engineer this new hand-held device that can manage flight path, operations and sensor payloads for multiple drones on one system.
The effort is aligned with the Army’s now-in-effect Small UAS Roadmap, which calls for increased interoperability, autonomy and command and control to expand the range and effectiveness of small attack drones.
The introduction of a common operational infrastructure backbone for a range of different sensors introduces a sphere of new tactics, techniques and procedures (TTPs) for the Army, as soldiers can readily switch from one sensor view to another in a fast-moving combat environment.
The Army owns the technical blueprint for the system, called a technical data package, and has been working with industry firms such as MAG Aerospace to engineer the new controller. Service plans call for the controller to reach formal production within the next year. MAG engineers have been supporting the Army by working on prototypes of the new controller and providing “initial qualification training” and logistical support.
The new controller can operate a range of small drones to include the one-pound Wasp Micro Air Vehicle, four-pound Raven drone and 13-pound Puma, among others.
MAG developers explained that the common controller could, among other things, perform squad-level reconnaissance in high-risk areas.
“We could put an asset above a building to do a quick check, before fixed wing or multi-rotorcraft attack a target objective,” Pat Wells, MAG Program Manager for Small UAS, told Warrior Maven in an interview.
The new system is engineered to enable faster exchange between sensor applications, and therefore generate improved sensor-to-shooter time for nearby ground units. There may also be combat circumstances wherein drone operators need to quickly shift from an electro-optical sensor to infrared thermal imaging should there be a need to track a heat signature. Infrared imaging can be of particular value in areas where there may be obscurants or weather conditions challenging standard electro-optical detection.
Drawing upon faster processing speeds and improved levels of autonomy, a technically enhanced common controller could help drones share information with one another, all while networking with soldiers on the ground. For instance, if one small drone is tracking the course of an enemy armored vehicle, which then disappears into a wooded area on the other side of a ridge, a second nearby drone might be cued to use infrared sensors to track the heat signature coming from the engine of the otherwise undetectable vehicle operating beneath the trees.
A 13-pound hand-launched Puma, for example, can operate its gimbaled camera at altitudes of up to 500 feet; should an enemy target or object of interest travel over a ridge at an altitude higher than 500 feet, a ground-based operator might wish to quickly switch from one small drone to another in a better position to track the target, a tactic which could be expedited by a new common controller enabling fast exchange between drones.
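The cross-drone handoff described in these scenarios amounts to a simple selection rule: when the tracking drone can no longer cover the target, cue the best-positioned alternative. The sketch below is a hypothetical illustration only; the drone sensor ceilings, infrared fits, and selection criterion are assumptions, not the actual logic of the Army's common controller.

```python
# Hypothetical cross-drone handoff selector. Sensor ceilings, IR fits, and
# the "closest capable drone" rule are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Drone:
    name: str
    sensor_ceiling_ft: float      # max useful altitude for the gimbaled camera
    has_infrared: bool
    distance_to_target_m: float

def select_tracker(drones, target_alt_ft, need_infrared=False):
    """Pick the closest drone able to cover the target altitude (and IR if required)."""
    candidates = [
        d for d in drones
        if d.sensor_ceiling_ft >= target_alt_ft
        and (d.has_infrared or not need_infrared)
    ]
    if not candidates:
        return None  # no drone in the fleet can take the handoff
    return min(candidates, key=lambda d: d.distance_to_target_m)

# A notional small-UAS fleet of the types named in the article.
fleet = [
    Drone("Puma", 500, True, 120.0),
    Drone("Raven", 800, False, 300.0),
    Drone("Wasp", 400, True, 90.0),
]

# Target climbs over the ridge to 600 ft: the controller hands off past the Puma.
tracker = select_tracker(fleet, target_alt_ft=600)
# A heat-signature track at 300 ft requires an infrared-equipped drone.
ir_tracker = select_tracker(fleet, target_alt_ft=300, need_infrared=True)
```

In this toy fleet the 600-foot target forces a handoff to the notional Raven, while the infrared requirement selects the closest IR-capable drone; a common controller would let the operator execute such a switch without changing ground stations.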
“This controller is intended for small UAS systems. The Army has a roadmap for small UAS. Their intent is that whatever system they acquire will likely be operated by this common controller,” Wells added.
The multi-faceted effort is consistent with the service’s ongoing Tactical Unmanned Aircraft Systems research and development initiative aimed at quickly harvesting and integrating new small drone technologies.
Phase 1 of the effort, according to an Army statement, is focused on five main areas to include “take-off and landing, performance trade-offs, autonomy, all weather sensors and drone teaming.” Phase 2 will follow with extensive trade studies and analysis, according to an Army report.
More seamless integration when it comes to operating small drones is being accelerated by major advances in autonomy, computer processing and applications of AI. A 2017 essay in the “United States Army Aviation Digest” points out that small drones, empowered by increased automation and AI, could function as defenses against incoming missiles. In this scenario, having an ability to transition from one drone, and drone payload, to another could greatly improve a commander’s decision cycle. Certainly when it comes to small drones, AI-enhanced processing could easily function defensively by identifying the location and nature of incoming enemy attacks – and possibly cue interceptor systems.
“Without an evolved operating concept for unmanned aircraft systems (UAS) that includes the role of artificial intelligence (AI) and autonomous weapon systems, the Army not only risks stumbling its way into its most capable future weapon system, but also potentially the future’s most powerful technology,” the essay, written by Col. Robert Ault, states.
As part of its evolving long-range drone plan, the Army is also making progress with a new system giving in-flight helicopter crews an ability to view multiple drone feeds simultaneously, Army officials told Warrior.
The move to engineer common standards and interfaces, such as those which enable the new controller, is also informing a new Army system called Supervisory Controller for Optimal Role Allocation for Cueing of Human Operators, or SCORCH. Army officials say SCORCH enables a single operator to control up to three drones at the same time.
The most recent operational manned-unmanned teaming technology, called Level 4 LOI (Level of Interoperability 4), has been used with great success in Afghanistan by the 1-229th Attack Reconnaissance Battalion. Level 4 LOI MUM-T (manned-unmanned teaming) enables AH-64E Apaches and OH-58 Kiowas to control the flight path and sensor payload of Army Shadow and Gray Eagle drones.
MUM-T uses Tactical Common Data Link Assembly for the AH-64E, providing fully integrated ranges exceeding 50 km.
An Army official working on drone technological development told Warrior “MUM-T Operations are made possible by the introduction of a standardized interoperability protocol supporting video/data transmissions between ground-manned-unmanned platforms.” (Source: UAS VISION/Warrior Maven)
11 Feb 19. BAE Systems has invited businesses from across the UK to showcase their hi-tech and innovative ideas to help resolve real world complex issues. The challenge resulted in a Dragons’ Den style event being staged in Portsmouth by the company’s Maritime Services business. A total of 13 businesses, large and small, from as far afield as Scotland, were invited to pitch their innovations. Two of the companies, Intrepid Minds of Warwickshire and Aralia Systems of West Sussex, have been selected to discuss potential opportunities for collaboration to address some of the technical challenges faced by BAE Systems.
BAE Systems Technologist, Matt Albans, explained that the challenges were issued anonymously through external agency the Knowledge Transfer Network, which specialises in enabling innovation sharing between businesses.
Only when the companies were selected were they told they’d be presenting their ideas to one of the largest defence companies in the world.
The participants faced a panel of BAE Systems experts and had 20 minutes to make their pitch before a question and answer session.
Matt, who co-ordinated the event, said the companies ranged from one person start-ups to a multi-national defence company – “and everything in between”.
The challenges involved using artificial intelligence to recognise specific components in complex pieces of machinery and the development of state-of-the-art techniques to monitor the rates of degradation of metal and painted surfaces.
“Here in Portsmouth we’re extremely proud of the fact that the work we do enables the Royal Navy to deploy the fleet to protect our shores and those of our NATO allies around the world,” said Matt.
“There are some really clever ideas out there and as a large company we’re always looking for exciting new opportunities to collaborate with industry partners and the supply chain to develop the best possible solutions for our customers.
“In this way we can develop even more innovative ways of helping our armed forces to stay a step ahead to protect people, national security, critical infrastructure and vital information.”
“A lot of people said it was the first time they’d been able to pitch their ideas to a company the size of BAE Systems and they really appreciated the opportunity.”
The event also highlighted possible opportunities for some of the companies which presented to collaborate with different areas of the business. These opportunities are now being investigated further.
Adam Smith, Managing Director of Intrepid Minds, which manufactures state-of-the-art drone data analytic systems for a broad range of applications, said: “BAE Systems could be seen as one of those big, prime companies that people like me from small and medium enterprises respect but are wary of. That’s not the case at all and opportunities like this make you appreciate just how much we can do for one another.
“Intrepid Minds, which I set up eight years ago, now employs 25 people. We already have a really good track record of innovation in a number of areas, including the defence sector, so we’re hopeful that this event will open even more doors for us.”
Due to the unprecedented success of the event, further sessions are being planned throughout 2019, with the additional involvement of BAE Systems’ Air and Submarines businesses.
07 Feb 19. Algorithmic Warfare: AI Project to Link Military, Silicon Valley. Almost a year ago, Google employees made waves from Silicon Valley to Washington, D.C., by signing a letter objecting to the company’s work on the Defense Department’s Project Maven. The effort, which aimed to develop AI systems capable of analyzing reams of full-motion video data collected by drones and tipping off human analysts when people and events of interest appear, was viewed by the employees as putting Google in the business of war. Eventually, the company chose not to pursue another Project Maven contract.
But the brouhaha might have been mitigated had the Pentagon and Silicon Valley known how to communicate with each other better, said Paul Scharre, director of the Center for a New American Security’s technology and national security program.
“There’s not a lot of crosstalk and crosspollination between these communities — between policymakers and those in the AI community who are concerned,” said Scharre, who is also the author of the book Army of None: Autonomous Weapons and the Future of War.
To bridge the gap, CNAS is spearheading a new effort — known as the Project on Artificial Intelligence and International Stability — to create more dialogue between policymakers, the developers of AI platforms and national security experts working outside of government.
“You need perspectives from all three to really grapple with … [these issues] effectively,” he said.
The project will focus not only on military applications of AI, but also on how other countries are employing the technology, he said.
“Countries around the globe [are] making very clear their intent to harness artificial intelligence to make their countries … stronger, to increase national and economic competitiveness,” he said.
“We’ve seen well over a dozen countries now launch some form of national strategy for artificial intelligence.”
China and Russia have trumpeted the fact that they intend to make AI a cornerstone of their future scientific pursuits and become world leaders in the technology.
One of the main purposes of the project is to better understand the risks associated with nations beginning to invest in AI technology and the security dilemmas that the United States should be concerned about. It will bring together three communities — policymakers, those in the AI community and security studies scholars — to better understand the problem, he said.
Scharre, who has researched artificial intelligence for years, said in the course of the work he has done, he has seen a lack of communication between different stakeholders.
“I’ll talk to policymakers working on AI issues. I’ll talk to AI researchers who are concerned about these issues surrounding competition. I’ll talk to people in academics or the security studies field who are interested in artificial intelligence,” he said. “But I’m not sure that these communities are talking to each other enough. … To us the big value here is bringing these communities together and convening them.”
Michael Horowitz, an adjunct senior fellow at CNAS and a political science professor at the University of Pennsylvania focusing on military innovation, said rapid advances in artificial intelligence make it an important area of study.
“If you think about AI as something akin to electricity or the combustion engine, then understanding the way that it shapes international politics … becomes really complicated and it takes a lot of work … from a lot of different perspectives,” he said. “[We want] to convene some genuine discussions even among people that might disagree about some things and use that as a basis to … generate intellectual progress.”
The heart of the effort will be four workshops held in the United States and abroad to bring stakeholders together, Scharre said. The first meeting will take place this spring in Washington, D.C.; others will be held in San Francisco, Philadelphia and London.
“We’re going to try and get out of Washington to engage these different communities because that’s a convenient place for policymakers, but it’s not necessarily a hub for these other communities,” he said.
These will be closed-door events operating under the Chatham House rule, he noted. There will also be a number of smaller, public events, he added.
Horowitz said: “Open events are great for trying to get the word out and expose the broader community to some of … the issues and challenges surrounding AI. Closed sessions can have value as well because sometimes people are more willing to be open and honest in a more private discussion.”
While the agendas are not yet finalized, the meetings will focus on a variety of topics, such as AI safety, he said. The technology carries risks of failure and misuse, he noted, including hacking, spoofing and even a human applying an algorithm outside of the context for which it was designed.
“All of those generate safety and reliability challenges that … [need to be] understood analytically and managed practically,” he said.
CNAS plans to capture some of the insights gleaned from the workshops and distribute them via more “native venues” for some of the communities the think tank is targeting, Scharre said.
“If we want to talk to security studies scholars and tech folks and policymakers, they read different things and they have different kinds of conversations,” he said. “We will do some think tank reports that dig in deep into these topics, but also try to catalyze the conversation in some of these other kinds of venues [like op-eds] to engage these audiences in this broader discussion.”
CNAS has grown its team to staff the project, adding a number of full-time and adjunct positions, Scharre said. The effort is being supported by a grant from the Carnegie Corp. of New York. The think tank announced the campaign in December, and it will run through October 2020. (Source: glstrade.com/National Defense)
Oxley Group Ltd
Oxley specialises in the design and manufacture of advanced electronic and electro-optic components and systems for air, land and sea applications within the military sector. Established in 1942, Oxley has manufacturing facilities in the UK and USA and enjoys representation worldwide. The company’s products include night vision and LED lighting, data capture systems and electronic components. Oxley has pioneered the development of night vision compatible lighting. It offers a total package incorporating optical filters, equipment modification, cockpit and external lighting, along with fleet-wide upgrade services including engineering, installation, support, maintenance and training. The company’s long experience of manufacturing night vision lighting and LED indicators, coupled with advances in LED technology, has enabled it to develop LED solutions to replace incandescent and fluorescent lighting in existing applications and to become the lighting option of choice in new applications such as portable military hospitals, UAV control stations and communication shelters.