Special Marriage Act
(GS-I: Salient features of Indian Society, Diversity of India)
Recently, several interfaith couples, including some celebrities, chose to marry under a secular civil law, the Special Marriage Act, 1954.
About Special Marriage Act, 1954:
The Special Marriage Act, 1954 (SMA) was passed by Parliament on October 9, 1954. It governs civil marriages, which are sanctioned by the state rather than by any religion.
Need for such an act:
Religious personal laws, such as the Hindu Marriage Act, 1955, apply only when both spouses profess the same religion, so an interfaith couple would ordinarily need one spouse to convert to the religion of the other before marrying under them.
However, the SMA enables marriage between inter-faith or inter-caste couples without them giving up their religious identity or resorting to conversion.
Who can get married under the Special Marriage Act?
The Act applies to people of all faiths, including Hindus, Muslims, Sikhs, Christians, Jains, and Buddhists, across India.
The minimum age to get married under the SMA is 21 years for males and 18 years for females.
What is the procedure for a civil marriage?
As per Section 5 of the Act, the parties to the marriage are required to give a notice, in writing, to a “Marriage Officer” of the district in which at least one of the parties has resided for at least 30 days immediately preceding the notice.
The parties and three witnesses are required to sign a declaration form before the Marriage Officer.
Once the declaration is accepted, the parties will be given a “Certificate of marriage” which is essentially proof of the marriage.
What are hallucinating chatbots?
(GS-III: Science and Technology- Developments and their Applications and Effects in Everyday Life)
Google has warned of a pitfall of artificial intelligence (AI) chatbots: they can sometimes "hallucinate".
Background: These reports emerged as OpenAI (ChatGPT), Google (Bard), and Microsoft (Bing’s beta) were opening up their AI-enabled chatbots for test users.
What are hallucinating chatbots?
AI chatbots are trained to have human-like conversations using a process known as natural language processing (NLP).
With NLP, they can interpret human language as it is written, which enables them to operate more or less on their own.
Hallucination in AI chatbots is when a machine provides convincing but completely made-up answers (untrue facts). It is not a new phenomenon.
For example, after being live on Twitter for just 24 hours in 2016, Microsoft’s chatbot Tay started parroting racist and misogynistic slurs back at users.
Why do AI chatbots start hallucinating?
These models are required to rephrase, summarise, and present intricate tracts of text without constraints, and they cannot reliably distinguish facts from merely contextual information. Facts are therefore not "sacred" to the model; they are treated as just one more kind of context to be sifted through.
For example, when asked, “What does Albert Einstein say about black holes?” AI models can return a quote made famous on the Internet rather than factual information.
The problem becomes acute when the source material has complex grammar.
The biggest challenge: Identification of hallucinated texts
Researchers are working to tabulate and collect hallucinated texts produced by AI models, and to build filters into the models that can recognize and eliminate such output.
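The filtering idea above can be illustrated with a toy sketch. This is purely hypothetical and is not any real detection system: it flags a generated sentence as unsupported when too few of its content words appear in a small set of trusted reference facts. Real hallucination detection is an open research problem.

```python
# Toy illustration (hypothetical): flag generated sentences whose content
# words have little overlap with any trusted reference fact. This only
# sketches the "filter hallucinated text" idea, not a real method.

def content_words(text):
    """Lowercase the text and keep words longer than three characters,
    with surrounding punctuation stripped."""
    return {w.strip(".,!?\"'").lower() for w in text.split() if len(w) > 3}

def is_supported(sentence, references, threshold=0.5):
    """Return True if at least `threshold` of the sentence's content
    words appear in some single reference fact."""
    words = content_words(sentence)
    if not words:
        return True  # nothing to check
    return any(len(words & content_words(ref)) / len(words) >= threshold
               for ref in references)

references = [
    "The Special Marriage Act was passed by Parliament on October 9, 1954.",
]

print(is_supported("The Special Marriage Act was passed in October 1954.",
                   references))  # → True
print(is_supported("Einstein said black holes are cosmic vacuum cleaners.",
                   references))  # → False
```

A production filter would of course need semantic matching rather than word overlap, but the structure, generate, check against trusted sources, then suppress unsupported claims, is the same.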
'Feathering' of aircraft propellers
A preliminary report by the Aircraft Accident Investigation Commission of Nepal on the crash of a Yeti Airlines ATR 72-500 brought to light that the propellers of the plane were found in an unusual "feathered" position.
What is ‘feathering’?
In simple terms: when an aeroplane's engine fails in flight, the pilot of a propeller aircraft can rotate the propeller blades so that their edges face the oncoming airflow. This is called "feathering" the propellers.
Feathering minimises the drag from a dead propeller, helping the aeroplane glide farther and maintain a safe speed, which is why pilots routinely feather a failed engine's propeller before an emergency landing. A propeller left unfeathered keeps "windmilling" in the airstream: at low altitude the extra drag can be dangerous, although at higher altitude the windmilling can be used to help restart the engine.
Jal Jan Abhiyan
PM virtually inaugurates Jal Jan Abhiyan in Rajasthan.
About the campaign:
It is run jointly by the Ministry of Jal Shakti and the Brahma Kumaris organisation.
The campaign aims to create a collective consciousness among people towards water conservation.
Important quotes by PM on water:
Water scarcity is seen as a looming crisis all over the world.
Water security is a huge question for India due to its large population.
In the Amrit Kaal, India is looking at water as the key to its future: there will be a tomorrow only if there is water.
Other campaigns for water:
About The Brahma Kumaris:
It is a spiritual, educational, and philanthropic movement that originated in Hyderabad, Sindh, during the 1930s. The Brahma Kumaris movement was founded by Lekhraj Kripalani, and the organisation is known for the prominent role that women play in it.