CSIR NAL flies test-drone that can double up as pseudo satellite (GS Paper 3, Science and Technology)
Why in news?
- Scientists at the CSIR-National Aerospace Laboratories (NAL) successfully tested an unmanned aerial vehicle, called High Altitude Pseudo Satellite (HAPS), at Challakere, Karnataka.
- HAPS are like drones, except that they operate in the stratosphere and draw enough power from solar cells and a battery system to stay aloft for days on end.
Applications:
- A fully working HAPS can be used for a variety of applications, from surveillance to beaming 5G waves.
- They can double up as “towers in the sky” and offer more flexibility than satellites in mapping a piece of land from above. The NAL system is currently far from these goals.
- The vehicle flown was a scaled-down prototype. The 5-metre-long system, with a wingspan of 11 metres and a weight of 23 kg, rose to about 3 km and stayed put for about eight hours, meeting or exceeding all the performance metrics set out for it.
- A series of further tests has been planned, expected to culminate by 2027 in a full-scale craft with a wingspan of 30 m (nearly as much as a Boeing 737). It will be able to rise to 23 km and stay airborne for at least 90 days.
Challenges:
- They are unmanned and must fly for days and nights on end, meaning they need solar cells and batteries powerful enough to lift the airframe into the stratosphere (which extends from about 20 km to 50 km above the earth’s surface).
- This ascent is challenging, given atmospheric turbulence and the fact that these are relatively light aircraft.
- Unlike the familiar solar panels on rooftops, those used to power the plane are extremely thin solar films. There are only one or two companies in the world capable of making solar-cell films that thin.
What’s next?
- CSIR-NAL aims to design and build the HAPS’ propellers, battery management system, carbon-composite airframe, flight-control system, and the high-powered electric motors that can withstand extreme temperature ranges.
How can child safety be ensured online?
(GS Paper 2, Governance)
Context:
- In early February, Meta CEO Mark Zuckerberg publicly apologised to parents whose children were victims of online predators, during a Congressional hearing that was hostile not just to Meta but also to other tech majors including X, TikTok, Snapchat, and Discord.
- The hearing, titled ‘Big Tech and the Online Child Sexual Exploitation Crisis’, was reportedly called “to examine and investigate the plague of online child sexual exploitation”, and the executives were taken to task for abdicating their responsibility to protect children on social media platforms.
What are the issues with children’s safety online?
- Tech majors increasingly find themselves in the midst of a maelstrom of protests across the world, not just over privacy concerns but also over the safety of users online.
- Across the world, parents and activists are aggressively advancing the agenda of having the tech companies take responsibility, or provide platforms that are ‘safe by design’ for children and young users.
- A UNICEF report of 2023, ‘The Metaverse, Extended Reality and Children’, attempted an analysis of how virtual environments may evolve and how they are likely to influence children and young adults. These technologies do offer many potential benefits for children, such as in the areas of education and health.
What are the potential risks?
- The potential risks to children are significant. These include safety concerns such as exposure to graphic sexual content, bullying, and sexual harassment and abuse, which in immersive virtual environments can feel more ‘real’ than on current platforms.
- Further, vast amounts of data, including data about non-verbal behaviour, are collected, potentially allowing a handful of large tech companies to carry out hyper-personalised profiling, advertising and increased surveillance, affecting children’s privacy, security, and other rights and freedoms.
- While the complete immersion in an alternate reality that the metaverse promises is still some way off, there are already many virtual environments and games that, though not fully immersive, are indicative of the dangers of that world.
- For instance, the hugely popular Grand Theft Auto, which has both adult and child versions, includes in the adult version an instruction to ‘approach a prostitute and spank her many times’, and adolescents are likely to pick the adult version.
- There have also been media reports of children using Artificial Intelligence tools to generate indecent child abuse images.
- Then there is the mental health aspect: children face the prospect of trauma, solicitation and abuse online, which can leave deep psychological scars that affect their lives in the real world too.
- Innocuous and innocent sharing of images online can also be twisted by depraved predators. End-to-end encryption is essential to protect the information that children share online, points out Ms. Suresh.
What can be done to keep children safe online?
- The primary responsibility lies with the tech companies, which will have to incorporate ‘safety by design’. The Congressional hearings have made it obvious that these companies are fully cognisant of the extent to which their apps and systems harm children.
- Drawing on the Convention on the Rights of the Child, UNICEF offers guidance that lists nine requirements for child-centred AI, including support for children’s development and well-being, and protecting children’s data and privacy.
- UNICEF recommends that tech companies apply the highest existing data protection standards to children’s data in the metaverse and virtual environments.
- In addition, governments bear the burden of periodically assessing and adjusting regulatory frameworks to ensure that such technologies do not violate children’s rights, and must use their authority to address harmful content and behaviour inimical to children online.