Daily Current Affairs 10.05.2022 (Rupee hits a new low amid dollar outflow, The search algorithm in action, The standard model of particle physics gets a jolt)

1. Rupee hits a new low amid dollar outflow

The rupee on Monday fell to an all-time low of 77.44 against the U.S. dollar due to a sell-off in equities amid concerns over weakening global growth prospects, dollar outflows and fears of further monetary tightening by central banks to counter rising inflation. The previous closing low for the rupee was 77.09, seen on March 7.

“A sell-off in the global equity markets, which was triggered by the hike in interest rates by the U.S. Federal Reserve, the war in Europe and growth concerns in China due to COVID-19, led to the rupee depreciation,” Emkay Global Financial Services said.

2. The search algorithm in action

Why are algorithms used by dominant search engines becoming a privacy threat? Do algorithms ‘filter’ the internet according to user patterns?

The story so far: Algorithms play a crucial role for search engines as they process millions of web searches every day. With the quantity of information available on the internet growing steadily, search algorithms are becoming increasingly complex, raising privacy and other concerns and drawing the attention of regulators. Last month, the U.K.’s digital watchdog said it would take a closer look at algorithms, seeking views on the benefits and risks of how sites and apps use them, as well as input on auditing algorithms, the current landscape and the role of regulators.

How do search algorithms work?

An algorithm, essentially, is a series of instructions. It can be used to perform a calculation, find answers to a question or solve a problem. Search engines use a number of algorithms to perform different functions prior to displaying relevant results to an individual’s search request.

Tech giant Alphabet Inc.’s Google, whose flagship product is the Google search engine, is the dominant player in the search market. Its search engine provides results to consumers with the help of its ranking systems, which are composed of a broad set of algorithms that sort through web pages in its search index to find the most appropriate results quickly. Its search algorithms consider several factors, including the words and expressions of a user’s query, the relevance and usability of pages, the expertise of sources, and the user’s location and settings, according to the firm.
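As a rough illustration of the idea, and emphatically not of Google’s actual system, which is proprietary, a ranking algorithm can be pictured as a scoring function that combines several such signals into one number per page. Every factor, weight and field name below is invented for this sketch:

```python
# A hypothetical sketch of a search ranking algorithm: a scoring
# function combining several signals into one number per page.
# All factors, weights and field names are invented for illustration;
# real ranking systems are vastly more complex and not public.

def score_page(page, query, user_location):
    score = 0.0

    # Query relevance: fraction of query words found on the page.
    words = query.lower().split()
    matches = sum(1 for w in words if w in page["text"].lower())
    score += 2.0 * matches / max(len(words), 1)

    # Usability: reward pages that load quickly.
    if page["load_time_s"] < 2.0:
        score += 1.0

    # Expertise of the source: a prior quality rating for the site.
    score += 1.5 * page["authority"]

    # Personalisation: boost pages relevant to the user's location.
    if page.get("region") == user_location:
        score += 0.5

    return score

def search(index, query, user_location):
    # Return all indexed pages, best-scoring first.
    return sorted(index,
                  key=lambda p: score_page(p, query, user_location),
                  reverse=True)
```

Because personal signals such as location enter the score, two users typing the same query can be shown different orderings, which is exactly the behaviour described below.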

While Google captures a significant chunk of the general search market, there are alternative search engines such as Microsoft’s Bing and DuckDuckGo available for users to explore. The latter, a privacy-focused search engine, claims it does not collect or share users’ personal information.

In January, market leader Google generated 61.4% of all core search queries in the U.S., according to database company Statista. During the same period, Microsoft sites handled a quarter of all search queries in the U.S.

As the algorithms used to deliver results vary from one search engine to another, the results for the same query will also differ. Moreover, results for different users are rarely identical, even when they search for the same thing, since the algorithms take into account multiple factors, such as location.

How are they developed?

Algorithms are often built using historical data and for specific functions. Once developed, they are updated frequently by the companies to enhance the quality of the search results presented to users. Most large search engine providers also rely on machine learning to improve their users’ search experience automatically, essentially by identifying patterns in previous decisions to make future ones.
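One simple way to picture “identifying patterns in previous decisions” is a feedback loop over click data: results that users repeatedly choose get nudged upward over time. The sketch below is a toy model of that idea, not any search provider’s actual mechanism:

```python
# Toy relevance-feedback loop: past clicks adjust future rankings.
# Purely illustrative; real systems use far richer learned models.
from collections import defaultdict

click_boost = defaultdict(float)  # page id -> learned score adjustment

def record_click(page_id, shown_rank):
    # A click on a result shown further down the page is stronger
    # evidence that it was under-ranked, so it earns a larger boost.
    click_boost[page_id] += 0.01 * shown_rank

def adjusted_score(page_id, base_score):
    # Combine the static score with what the click history suggests.
    return base_score + click_boost[page_id]
```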

Over the years, Google has developed search algorithms and updated them constantly, with some major updates like Panda, Penguin, Hummingbird, RankBrain, Medic, Pigeon, and Payday, meant to enhance some function or address some issue. In March, it introduced another update to improve the search engine’s ability to identify high-quality product reviews.

Search engines exert huge control over which sites consumers can find. Any changes or updates in their algorithms could also mean that traffic is steered away from certain sites and businesses, which could have a negative effect on their revenue.

What are the concerns?

The search giant’s trackers have allegedly been found on a majority of the top million websites, according to a DuckDuckGo blog post. “This means they are not only tracking what you search for, [but] they’re also tracking which websites you visit, and using all your data for ads that follow you around the internet,” it added.

According to a Council of Europe study, the use of data from profiles, including those established based on data collected by search algorithms and search engines, directly affects the right to a person’s informational self-determination. Most of Google’s revenues stem from advertisements, such as those it shows consumers in response to a search query.

DuckDuckGo, in addition to providing an alternative to Google’s search engine, offers mobile apps and desktop browser extensions to protect users’ privacy while browsing the web. The privacy-focused firm, in a blog post, said that editorialised results, informed by the personal information Google has on people (like their search, browsing, and purchase history), put them in a “Filter Bubble” based on what Google’s algorithms think they are most likely to click on.

What’s the current state of these algorithms?

These search algorithms can be used to personalise services in ways that are difficult to detect, leading to search results that can be manipulated to reduce choice or artificially change consumers’ perceptions.

Additionally, firms can use these algorithms to change the way they rank products on websites, prioritising their own products and excluding competitors. Some of these concerns have caught the eye of regulators, and as a result search algorithms have come under scrutiny.

The European Commission fined Google €2.42 billion in 2017 for abusing its market dominance as a search engine by giving an illegal advantage to another Google product, its comparison-shopping service.

Moreover, under the Commission’s proposal for the Digital Services Act, transparency measures for online platforms on a variety of issues, including the algorithms used for recommending content or products to users, are expected to come into force.

“Majority of algorithms used by private firms online are currently subject to little or no regulatory oversight,” the U.K.’s Competition and Markets Authority said earlier in a statement, adding that “more monitoring and action is required by regulators.”

3. The standard model of particle physics gets a jolt

What does the CDF measurement of W boson mass imply for physics?

The story so far: On April 7, researchers from the Collider Detector at Fermilab (CDF) Collaboration, in the U.S., announced through a paper in Science that they had made a precise measurement of the mass of the so-called W boson. They stated that this precisely determined value did not match what was expected from estimates using the standard model of particle physics. This result is highly significant because it implies the incompleteness of the standard model description. This is a major claim, since the standard model has been extraordinarily successful over the past decades. Hence, physicists are looking for corroboration from other, independent, future experiments.

What is the standard model of elementary particle physics?

The standard model of elementary particles is a theoretical construct in physics that describes the particles of matter and their interactions. It is a description that views the elementary particles of the world as being connected by mathematical symmetries, just as an object and its mirror image are connected by a bilateral (left–right) symmetry. These are mathematical groups generated by continuous transformations from, say, one particle to another. According to this model, there is a finite number of fundamental particles, which are represented by the characteristic “eigen” states of these groups. The particles predicted by the model, such as the Z boson, have been seen in experiments, and the last to be discovered, in 2012, was the Higgs boson, which gives mass to the heavy particles.

Why is the standard model believed to be incomplete?

The standard model is thought to be incomplete because it gives a unified picture of only three of the four fundamental forces of nature (the electromagnetic, weak nuclear and strong nuclear interactions); it totally omits the fourth, gravity. So, in the grand plan of unifying all forces so that a single equation would describe all the interactions of matter, the standard model is found to be lacking.

The other gap in the standard model is that it does not include a description of dark matter particles. So far these have been detected only through their gravitational pull on surrounding matter.

How are the symmetries related to particles?

The symmetries of the standard model are known as gauge symmetries, as they are generated by “gauge transformations” which are a set of continuous transformations (like rotation is a continuous transformation). Each symmetry is associated with a gauge boson.

For example, the gauge boson associated with electromagnetic interactions is the photon. The gauge bosons associated with weak interactions are the W and Z bosons. There are two W bosons — W+ and W-.
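In standard textbook notation (background knowledge, not stated in the article itself), these symmetries and their force carriers can be summarised as follows:

```latex
% Gauge group of the standard model (textbook notation):
SU(3)_C \times SU(2)_L \times U(1)_Y
% SU(3)_C         : the strong interaction, mediated by eight gluons
% SU(2)_L, U(1)_Y : the electroweak sector, which after symmetry
%                   breaking yields the massive W^+, W^- and Z bosons
%                   and the massless photon
```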

Inspired by the success of quantum electrodynamics, Sheldon Glashow, Abdus Salam and Steven Weinberg developed, in the sixties, the similar but more general ‘electroweak’ theory, in which they predicted these three particles and how they mediated the weak interactions. They were awarded the Nobel Prize for their efforts in 1979. The W boson was first seen in 1983 at CERN, on the Franco-Swiss border. Unlike the photon, which is massless, the W bosons are quite massive, which results in the force they mediate, the weak force, being very short ranged.

Unlike the photon, which is electrically neutral, the W-plus and W-minus are both massive and charged. By exchanging such W bosons, a neutron can change into a proton, for example. This is what happens in beta decay, a radioactive interaction that takes place in the sun. Thus, the W boson facilitates the interactions that make the sun burn and produce energy.
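Written out explicitly (a standard physics relation, added here for illustration), neutron beta decay proceeds through a W⁻ boson:

```latex
% Neutron beta decay, mediated by the W^- boson:
n \to p + e^- + \bar{\nu}_e
% At the quark level, a down quark inside the neutron emits a virtual W^-:
d \to u + W^-, \qquad W^- \to e^- + \bar{\nu}_e
```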

What is the main result of the recent experiment? What is the discrepancy they obtained?

The recent experiment at CDF measured the mass of the W boson, approximately 80 times the mass of a hydrogen nucleus, as 80,433.5 +/- 9.4 MeV/c2, which is more than what is expected from the standard model. The expected value using the standard model is 80,357 +/- 8 MeV/c2. This is estimated from a combination of analytical calculations and high-precision experimental observation of a few parameters that go into the calculation, such as the Z boson mass, the strength of the electromagnetic interaction, the Fermi constant, the Higgs boson mass and the top quark mass. Thus, the W boson mass itself is a prediction of the standard model, and any discrepancy in its mass means a lack of self-consistency in the standard model.
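As a back-of-the-envelope check, using only the two values quoted above with their errors added in quadrature, the size of the tension is roughly:

```latex
% Difference between measured and predicted W boson mass:
\Delta m = 80{,}433.5 - 80{,}357 = 76.5 \ \mathrm{MeV}/c^2
% Combined uncertainty (errors added in quadrature):
\sigma = \sqrt{9.4^2 + 8^2} \approx 12.3 \ \mathrm{MeV}/c^2
% Tension, in units of the combined uncertainty:
\Delta m / \sigma \approx 6.2
```

The CDF paper itself, using a somewhat smaller uncertainty on the standard model prediction, quotes a discrepancy of about seven standard deviations, far beyond the five-standard-deviation threshold physicists conventionally use to claim a discovery.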

However, this is not the last word, as the mass discrepancy of the W boson needs to be checked and confirmed to the same accuracy by other facilities, for example, the Large Hadron Collider (LHC).

Where do we stand now in terms of new physics?

New physics is in the air, and experiments have been gearing up for some years now to detect new particles. The Large Hadron Collider itself has been revamped for “Run 3”, which will carry out special experiments to look for physics beyond the standard model. A Perspective article by Claudio Campagnari and Martijn Mulders in Science points to several high-precision experiments in the pipeline, such as the International Linear Collider in Japan, the Compact Linear Collider and the Future Circular Collider at CERN, and the Circular Electron-Positron Collider in China. With its high-precision determination of the W boson mass, the CDF has struck at the heart of the standard model. It is a significant finding, and if it is confirmed by the LHC and other experiments, it will throw the field open for new ideas and experiments.
