New eyeglasses allow you to adjust prescription yourself

A new kind of eyeglasses is now available from a British company that allows the wearer to adjust the prescription anytime, anywhere, via small thumb-dials on the sides. Called Eyejusters, the glasses make use of a technology called a SlideLens, which very aptly describes how these glasses do their magic.

Each lens is actually two lenses with slightly different shapes; turning the thumb-dial slides one lens slightly left or right, which changes the focal point for the wearer. The lenses are moved until the person doing the focusing finds the sweet spot, which is exactly how users focus a pair of binoculars.

The glasses were developed by Dave Crosby, Owen Reading, Richard Taylor and Greg Storey, who found a common interest in self-adjustable glasses and in the process created a company to fulfill the goal of providing low-cost eyeglasses to the millions of people the world over who cannot afford a traditional pair. Their secondary market is people who could use adjustable glasses for purposes other than general use, such as reading, working on a computer or knitting; that is, people who, as they get older, find they have trouble focusing while performing different tasks.

The Eyejusters web site says their main market is the developing world, where many people with vision problems cannot afford to see an eye doctor, much less the glasses that would be prescribed. Eyejusters solve both problems: when ordered, they come with an eye chart that can be used to help discern whether a person's vision can be corrected with Eyejusters (the power range is from 0 to +4.5 D for the positive version and 0 to -5.0 D for the negative version) and to figure out which version is needed (for near or far vision correction).

The Eyejusters, which come with detachable thumb-dials, also come with a plastic case and a special cleaning cloth. The cloth can be used to clean both sides of both lenses, because the outer lens can be hinged down for easy access. The Eyejusters also offer UV protection, which has been incorporated to help prevent eye damage from the sun, another common problem in underdeveloped countries.

Another interesting aspect of the adjustable glasses is that, with a little tweaking, it appears they could be used to perform self-exams in more developed countries. By adding a digital display, the wearer could work out their own prescription and send it to a company that sells traditional eyeglasses, sidestepping an expensive trip to an ophthalmologist.

Citation: New eyeglasses allow you to adjust prescription yourself (2012, May 31) retrieved 18 August 2019.

© 2012 Phys.Org

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
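The sliding-lens mechanism described above resembles the well-known Alvarez-lens principle, in which the combined optical power of two complementary lens elements varies roughly linearly with their lateral offset. The sketch below illustrates that relationship; the linear coefficient and the clamping range are illustrative assumptions, not Eyejusters specifications.

```python
def slide_lens_power(offset_mm, diopters_per_mm=1.5, pos_limit=4.5, neg_limit=-5.0):
    """Approximate optical power (in diopters) of an Alvarez-style slide lens
    as a function of the lateral offset between its two elements.

    Power grows linearly with offset and is clamped to the device's range.
    The coefficient and limits here are hypothetical, chosen only to match
    the power range quoted in the article (0 to +4.5 D, 0 to -5.0 D).
    """
    power = diopters_per_mm * offset_mm
    return max(neg_limit, min(pos_limit, power))
```

At zero offset the two complementary surfaces cancel and the combined power is 0 D; turning the thumb-dial in either direction adds positive or negative power until the mechanical limit is reached.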

Lack of diversity in pygmy blue whales not due to man-made cause

A team of researchers working in Australia has found, via DNA analysis, that the lack of genetic diversity in pygmy blue whales is due to natural causes, not whaling. In their paper published in the journal Biology Letters, the researchers, affiliated with a variety of institutions in Australia, describe their study of the whales and why what they learned may help save them.

Pygmy blue whales are a subspecies of blue whale and, contrary to their name, are not really all that small: they average 24 meters in length, compared with their bigger cousins, which average 28 to 30 meters. For many years, marine biologists assumed that the lack of genetic diversity found in pygmy blue whales living around the shores of Australia was due to their numbers being cut by whaling. Now it appears that is not the case.

In this new effort, the researchers sought to better understand the history of the whales and how they wound up with so little diversity. To learn more, they obtained DNA samples from several specimens and then studied patterns of genetic mutations, which allowed them to see that the whales had all come from just a few individuals, a "founder group," beginning around 20,000 years ago. After comparing the pygmy whale DNA with that of blue whales living in other parts of the world, the team was able to ascertain that they had gotten their start as Antarctic blue whales.

The researchers note that 20,000 years ago the Earth was experiencing peak glaciation, which allowed blue whales (currently the largest animals on the planet) to travel to other places, one of which was, apparently, Australia. But then the glaciers retreated, leaving those that had migrated to adapt to their new environment; they evolved into the pygmy blue whales that exist today.

That is good news for the species, because it means their lack of diversity is not due to whaling (which did reduce their numbers dramatically, along with those of other blue whales) but to the fact that they are still such a new subspecies. Also, because they are now a protected species, the researchers believe that if other threats, such as various forms of human pollution, can be mitigated, the whales have a good chance of returning to their pre-whaling population.

Pygmy blue whale. Credit: © Research team (Attard et al.)

More information: Low genetic diversity in pygmy blue whales is due to climate-induced diversification rather than anthropogenic impacts, Biology Letters, DOI: 10.1098/rsbl.2014.1037

Abstract: Unusually low genetic diversity can be a warning of an urgent need to mitigate causative anthropogenic activities. However, current low levels of genetic diversity in a population could also be due to natural historical events, including recent evolutionary divergence, or long-term persistence at a small population size. Here, we determine whether the relatively low genetic diversity of pygmy blue whales (Balaenoptera musculus brevicauda) in Australia is due to natural causes or overexploitation. We apply recently developed analytical approaches in the largest genetic dataset ever compiled to study blue whales (297 samples collected after whaling and representing lineages from Australia, Antarctica and Chile). We find that low levels of genetic diversity in Australia are due to a natural founder event from Antarctic blue whales (Balaenoptera musculus intermedia) that occurred around the Last Glacial Maximum, followed by evolutionary divergence. Historical climate change has therefore driven the evolution of blue whales into genetically, phenotypically and behaviourally distinct lineages that will likely be influenced by future climate change.

Journal information: Biology Letters

Citation: Lack of diversity in pygmy blue whales not due to man-made cause (2015, May 6) retrieved 18 August 2019.

© 2015
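The founder effect the researchers describe has a standard population-genetics signature: expected heterozygosity decays by a factor of (1 - 1/(2N)) each generation in a population of effective size N, so a small founder group loses diversity far faster than a large one. The numbers below are illustrative only, not estimates from the paper.

```python
def expected_heterozygosity(h0, n_effective, generations):
    """Expected heterozygosity after `generations` of genetic drift in a
    population of effective size `n_effective`, starting from h0.

    Uses the classic drift expectation H_t = H_0 * (1 - 1/(2N))**t.
    """
    return h0 * (1.0 - 1.0 / (2.0 * n_effective)) ** generations

# A hypothetical small founder group versus a large source population,
# both starting from the same initial heterozygosity.
founder = expected_heterozygosity(0.5, n_effective=25, generations=500)
large = expected_heterozygosity(0.5, n_effective=5000, generations=500)
```

Under these made-up parameters the founder lineage retains only a tiny fraction of its initial diversity while the large population is nearly unchanged, which is the qualitative pattern that lets low diversity arise naturally rather than through overexploitation.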

Biology, meet philology: First application of phylogenetic evolutionary framework to color naming

That there are universal patterns in the naming of colors across languages has long been a topic of discussion in a range of disciplines, including anthropology, cognitive science and linguistics. However, previous color term research has not applied an evolutionary framework to the analysis of these worldwide patterns. Recently, scientists at Yale University traced the history of color systems in language by applying phylogenetic methods across a large language tree. They not only validated the phylogenetic approach to culture, but also generated a precise history of color terms across a large language sample drawn from the Pama-Nyungan languages of Australia, and moreover provided evidence supporting the loss and, as had previously been known, the gain of color terms in the evolutionary process.

Prof. Claire Bowern discussed several key elements of the paper that she and Hannah J. Haynie published in the Proceedings of the National Academy of Sciences, these being:
- tracking the evolution of color terms across a large language tree in order to trace the history of color systems;
- validating phylogenetic approaches to culture and providing an explicit history of color terms across a large language sample, the Pama-Nyungan languages of Australia;
- finding alternative trajectories of color term evolution beyond those considered in the standard theories; and
- applying Bayesian phylogenetic methods to data that bridge linguistics, cultural anthropology and cognitive science.

"We had a lot of trouble getting data for the languages, as well as interpreting the data we had," Bowern says. "Since there's no preexisting set of color term datasets for these languages, we had to create it from grammars, dictionaries and field notes." Moreover, she adds, if a term wasn't in the dataset, they had to determine whether it was truly absent from the language or simply not recorded in their data. The researchers also ran into difficulty deciding whether a word was truly a color term, as was the case with orange, which had to be excluded from their list of colors because too many instances of the term referred only to the fruit.

To address these challenges, the scientists used fairly standard techniques in phylogenetics to look at color terms. "We defined the problem as one in which we were comparing theories, and testing which of the theories were well supported by our data," she explains. "For example, we compared theories where languages only lost color terms to ones where they could only gain them, or where they could both gain and lose them." They found that their data and models did not support either of the first two theories. They also had to find ways to model the data that were computationally tractable: the number of possible parameters for computer models with seven color terms was more than could be calculated.

"There is now a substantial body of literature on phylogenetic approaches to culture," Bowern continues. (Phylogenetics is the study of evolutionary relationships among species, individuals or genes, as modeled on trees.) "It's still somewhat controversial, but the critiques of phylogenetics outside of biology often focus on the differences between biological systems and other areas of change, such as in linguistics or culture." That said, she points out that these critiques often overlook the degree of difference among biological evolutionary processes. "It's uncontroversial that language changes; it's also uncontroversial that languages aren't organisms. The point is that from Darwin on, language and biology have had a long history of cross-pollination and co-inspiration."

"Our key insights were that color terms could disappear from languages (as well as be innovated), but that this term loss is shaped by our cognitive systems. That is, our perception of the world disposes us to talk about things in certain ways. For example, humans aren't very good at smelling (compared to hearing or seeing), so it's not surprising that we don't have a lot of smell-based vocabulary. The same is true for color: the color terms that are most frequently named are (with some simplification) those that refer to regions of the color spectrum that are maximally perceptually distinct from one another. It's therefore not surprising that different languages would end up with similar color inventories. Our work shows that those same principles are at work in the ways in which color terms drop out of use."

Bowern notes that the study is relevant to a number of related disciplines, including anthropology, psychology and linguistics. "Color naming has been a central question of all these disciplines for over 100 years now," she points out. "Color was an early battleground around the idea that perception might shape language, and language, perception. In essence, the debate revolved around cultural relativity and cultural differences, and whether languages that lexicalize the world in ways that differ significantly from their major European and Asian counterparts might yield crucial insights into human evolution and prehistory."

"There's also been discussion about the role of material culture in leading to color naming elaboration," she adds. "An interesting example is purple, which in English originally referred to a particular type of ink, and was only later generalized as a color term that could refer to things other than ink. Our work shows how insights from historical linguistics give us a better idea about how these issues interact." The paper argues that while the cognitive constraints highlighted in the standard theories1,2 play a substantial role in the development of color term systems, further study from an evolutionary perspective can refine our understanding of the interaction of cognitive constraints and language change in shaping lexical systems.

"Perhaps an analogy with numbers would be useful," Bowern suggests. "It's not surprising that so many languages have a base ten system for counting, because the ease of counting on fingers and toes makes that a very salient choice. However, languages change over time through sound and word change, and so in many languages we find clear base ten systems being obscured." In English, she illustrates, the relationship between the words three and thirty isn't as clear as the relationship between eight and eighty. "We thought the same might be true for color – that is, while there might be cognitive constraints that lead humans to name colors in a certain way, we would also expect regular principles of language, particularly semantic, to apply – and indeed, that's what we find: The cognitive constraints provide a framework for the types of colors that tend to appear, but languages acquire and change their color words from many sources. In short, color terms undergo broadening and narrowing, as other words do – but they do so within a framework that constrains the organization of the system."

Since the paper addresses the links between perception, language, and the categorization of the natural world, Bowern was asked whether more detailed perception of the environmental color spectrum leads to corresponding terms (a structural evolutionary change in the ascending visual pathway), or whether terminology comes first in agglutinative languages as a complex form combining a color with a function. "I assume that it's driven by terminology," Bowern says. "Remember that this is not about what can be perceived – people can perceive color differences that they don't have words for, and perception varies between individuals even when the color lexicon does not. However, within terminological change, there might be several different ways that the change could proceed. Perhaps as a term becomes more specialized to a part of a color range, an additional term is analogically brought into the system. Or, conversely, as a term becomes analogically applied to color, another term's range is contracted." Bowern notes that the researchers did not find evidence in the Australian data for compound colors, such as light red.

Another interesting question is the dissociation between color perception and language in color blindness. "This is a good illustration of why we should usually look at the population/language level, rather than at specific individual speakers. A language still contains a word for red and green, even if some fraction of the population cannot distinguish those colors. We still have a vocabulary for music, even though some people are tone deaf. However, one could imagine a hypothetical language or dialect where enough people are color blind that those colors come to be called by the same name – but that hasn't happened in our data set."

Bowern's research group currently has work in progress on several aspects of language change related both to Australian languages and to language documentation and change in other regions. Specifically, they are investigating ways to use computer tools to produce sound- and text-aligned recordings; that is, to align a recorded story with the words in the corresponding transcription. "This is incredibly useful for language research," she says. Bowern is also continuing her work describing Australian languages, which has already provided some surprises. "In working on Australian language trees we found that Australia has approximately 400 languages – far more than the 250 languages previously thought."

Fig. 1. Evolutionary pathways of color term systems, after WCS. Credit: Haynie HJ, Bowern C (2016) Phylogenetic approach to the evolution of color term systems. Proc Natl Acad Sci USA 113(48):13666-13671.
Fig. 2. Parameters in dependent models.
Fig. 3. Ancestral state reconstructions on consensus tree.

More information: Phylogenetic approach to the evolution of color term systems, PNAS (2016) 113(48):13666-13671, doi:10.1073/pnas.1613666113

Related:
1. Basic Color Terms: Their Universality and Evolution (University of California Press, Berkeley, CA)
2. The World Color Survey (CSLI Publications, Stanford, CA)

Journal information: Proceedings of the National Academy of Sciences

Citation: Biology, meet philology: First application of phylogenetic evolutionary framework to color naming (2016, December 9) retrieved 18 August 2019.

© 2016
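The gain-versus-loss model comparison Bowern describes can be sketched with a two-state continuous-time Markov model (a color term is absent or present) evaluated on a tree with Felsenstein's pruning algorithm. This is a toy illustration of the general approach only: the authors used Bayesian phylogenetic methods over hundreds of Pama-Nyungan languages, and the tiny tree, rates and tip data below are invented.

```python
import math

def transition_probs(gain, loss, t):
    """2x2 transition matrix P(t) for a two-state continuous-time Markov
    chain: state 0 = term absent, state 1 = term present."""
    tot = gain + loss
    e = math.exp(-tot * t)
    pi0, pi1 = loss / tot, gain / tot
    return [[pi0 + pi1 * e, pi1 * (1 - e)],
            [pi0 * (1 - e), pi1 + pi0 * e]]

def loglik_three_taxon(gain, loss, tip_states):
    """Log-likelihood via Felsenstein pruning on the fixed toy tree
    ((A:1.0, B:1.0):1.0, C:2.0); tip_states = (A, B, C), each 0 or 1."""
    a, b, c = tip_states
    p1 = transition_probs(gain, loss, 1.0)
    p2 = transition_probs(gain, loss, 2.0)
    # Conditional likelihoods at the internal node joining A and B.
    node = [p1[s][a] * p1[s][b] for s in (0, 1)]
    # Root uses the stationary distribution as its prior.
    tot = gain + loss
    pi = (loss / tot, gain / tot)
    lik = sum(pi[r] * sum(p1[r][s] * node[s] for s in (0, 1)) * p2[r][c]
              for r in (0, 1))
    return math.log(lik)

# Invented data: languages A and B have the term, C does not.
full = loglik_three_taxon(gain=1.0, loss=1.0, tip_states=(1, 1, 0))
loss_only = loglik_three_taxon(gain=1e-6, loss=1.0, tip_states=(1, 1, 0))
```

Comparing the log-likelihoods of a gain-and-loss model against a (near) loss-only model on the same data is the shape of the test the researchers ran, scaled up to many languages, many terms and posterior distributions over trees.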

New earthquake forecasting system gave reliable forecasts of Italian aftershocks

A trio of researchers with the Istituto Nazionale di Geofisica e Vulcanologia in Italy has created an earthquake forecasting system that reliably predicted a series of aftershocks after a major quake in Italy. In their paper published on the open-access site Science Advances, the team describes the system and its accuracy in predicting the series of aftershocks that occurred during 2016 and 2017.

Scientists have long sought forecasting methods for earthquakes, for obvious reasons; unfortunately, to date little progress has been made despite hundreds of years of effort. Science has advanced one area of forecasting, however: aftershocks that occur after a major quake.

It has been noted that aftershock sequences frequently follow a power law, so seismologists can make predictions of aftershocks based on the severity of the preceding major quake. But, as the researchers note, there are exceptions: sometimes aftershocks seemingly occur randomly, and occasionally they even precede a large quake. It was these exceptions that the researchers sought to address with a new and improved earthquake forecasting system.

The system built by the team is called an Operational Earthquake Forecasting (OEF) system and is based on three predictive models. The first two are epidemic-type aftershock sequence (ETAS) models, which use clustering information to make predictions. The third is based on the short-term earthquake probability (STEP) model, which, as its name implies, focuses on short-term probabilities.

After refining their system, the researchers entered real-world data from the major earthquake that occurred in Amatrice, Italy, in 2016 and from the series of aftershocks that followed. The aftershocks were so unusual that earthquake experts gave them their own name: the Amatrice-Norcia seismic sequence. After running the system, the researchers found that it "issued statistically reliable and skillful space-time-magnitude forecasts" for the event.

While the results are promising, the researchers are quick to point out that their system is still in the pilot stage, and thus it is still unclear how well it might perform when used to predict aftershock patterns after future earthquakes.

A building at Accumuli damaged by the Amatrice earthquake (August 24, 2016). Credit: Marco Anzidei
The village of Amatrice, completely destroyed by the central Italy seismic sequence of August-October 2016. Credit: Marco Anzidei

More information: Warner Marzocchi et al. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence, Science Advances (2017). DOI: 10.1126/sciadv.1701239

Abstract: Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016–2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences.

Journal information: Science Advances

Citation: New earthquake forecasting system gave reliable forecasts of Italian aftershocks (2017, September 14) retrieved 18 August 2019.

© 2017
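The "Omori power law" decay mentioned above is usually written in its modified (Omori-Utsu) form, n(t) = K / (t + c)^p. The sketch below uses made-up parameter values; in practice K, c and p are fit to each sequence, and the paper's point is precisely that the Amatrice-Norcia sequence deviated significantly from this law.

```python
def omori_rate(t_days, k=100.0, c=0.05, p=1.1):
    """Modified Omori (Omori-Utsu) aftershock rate n(t) = K / (t + c)**p,
    in events per day, t_days after the mainshock. Parameters illustrative."""
    return k / (t_days + c) ** p

def expected_aftershocks(t1, t2, k=100.0, c=0.05, p=1.1, steps=10000):
    """Expected number of aftershocks between t1 and t2 days after the
    mainshock, by trapezoid-rule integration of the rate."""
    dt = (t2 - t1) / steps
    total = 0.0
    for i in range(steps):
        a, b = t1 + i * dt, t1 + (i + 1) * dt
        total += 0.5 * (omori_rate(a, k, c, p) + omori_rate(b, k, c, p)) * dt
    return total
```

Because the rate decays as a power law, most aftershocks are expected in the first hours and days, which is what makes departures from the law (bursts of seismicity weeks later, as at Amatrice-Norcia) so hard for simple models to forecast.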

Astronomers detect deep, long asymmetric occultation in a newly found low-mass star

An international team of astronomers has observed a deep, day-long asymmetric occultation of a recently detected low-mass star known as EPIC 204376071. In a research paper published February 21 on the arXiv preprint server, the scientists detail their finding and ponder various theories that could explain such a peculiar occultation.

Located some 440 light years away, most likely in the Upper Scorpius stellar association, EPIC 204376071 is a young (about 10 million years old) M star with a mass of about 0.16 solar masses and a radius of approximately 0.63 solar radii. The star has an effective temperature of nearly 3,000 K, a luminosity of around 0.03 solar luminosities and a rotation period of 1.63 days.

EPIC 204376071 was observed twice by NASA's Kepler spacecraft during its prolonged mission, known as K2. When the star was observed for the second time, in late 2017, a group of astronomers led by Saul Rappaport of the Massachusetts Institute of Technology (MIT) identified a single occultation-like event in the object's light curve.

The detected occultation lasted for about a day, and what interested the researchers most was that the event was extremely deep and noticeably asymmetric, with an egress about twice as long as the ingress. "In this work, we report the discovery of a one-day-long, 80 percent deep, occultation of a young star in the Upper Scorpius stellar association: EPIC 204376071," the astronomers wrote in the paper.

The event blocked up to about 80 percent of the star's light for an entire day. Apart from this one-day occultation, frequent flares and low-amplitude rotational modulation, EPIC 204376071 was quiet for the total of 160 days covered by K2's two observational campaigns.

According to the paper, a few things make the detected event unique: the continuous coverage with half-hour sampling of the flux, the very clearly mapped-out asymmetry in the occultation profile, and the weak emission identified in the Wide-field Infrared Survey Explorer (WISE) 3 and 4 bands.

The researchers concluded that such a deep eclipse with these properties cannot be explained by another star crossing EPIC 204376071. They propose instead that this unique, very deep, long dip in the light curve could be caused by orbiting dust or small particles; a second theory is that a transient accretion event of dusty material near the corotation radius of the star could be responsible for the observed occultation.

"We have explored two basic scenarios for producing a deep asymmetric occultation of the type observed in EPIC 204376071. In the first, we considered an intrinsically circular disk of dusty material anchored to a minor body orbiting the host star. (…) Second, we considered a dust sheet of material of basically unknown origin, though we do assume that the source of the dust is in a quasi-permanent orbit about the host star," the paper reads.

However, the astronomers added that it is too early to draw final conclusions about which of the two hypotheses is true. More studies of EPIC 204376071 are needed, in particular radial velocity measurements to search for evidence of an orbiting body, and adaptive optics observations to search for scattered light from disk structures or evidence of low-mass wide companions.

Artist's conception of the knife-edge dust sheet passing in front of EPIC 204376071. Credit: Danielle Futselaar

More information: S. Rappaport et al. Deep Long Asymmetric Occultation in EPIC 204376071. arXiv:1902.08152 [astro-ph.SR].

Citation: Astronomers detect deep, long asymmetric occultation in a newly found low-mass star (2019, March 5) retrieved 18 August 2019.

© 2019 Science X Network
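The shape the paper describes, roughly a day long, about 80 percent deep, with an egress about twice as long as the ingress, can be caricatured with a piecewise-linear light-curve model. This is only a geometric sketch for intuition, not the dust-sheet or disk models the authors actually fit; all durations below are invented to match the quoted proportions.

```python
def occultation_model(t_days, t0=0.0, depth=0.8, t_ingress=0.33,
                      t_flat=0.1, t_egress=0.66):
    """Normalized stellar flux for an asymmetric, trapezoid-like occultation:
    baseline 1.0, a linear ingress, a flat bottom at (1 - depth), and an
    egress twice as long as the ingress. Total duration is about one day,
    echoing the event described in the article; exact values are illustrative."""
    flat_start = t0 - t_flat / 2.0
    flat_end = t0 + t_flat / 2.0
    start = flat_start - t_ingress
    end = flat_end + t_egress
    if t_days <= start or t_days >= end:
        return 1.0                                   # out of occultation
    if t_days < flat_start:                          # fast ingress
        return 1.0 - depth * (t_days - start) / t_ingress
    if t_days <= flat_end:                           # bottom of the dip
        return 1.0 - depth
    return 1.0 - depth * (end - t_days) / t_egress   # slower egress
```

Sampling this model every half hour, as K2 did, would map out the asymmetry clearly: at equal time offsets before and after the dip center, the post-dip flux is still depressed while the pre-dip flux has already returned to baseline.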

IBM announces that its System Q One quantum computer has reached its 'highest quantum volume to date'

IBM has announced at this year's American Physical Society meeting that its System Q One quantum computer has reached its "highest quantum volume to date", a measure by which, the company reports, the computer has doubled in performance in each of the past two years.

Quantum computers are, as their name implies, computers based on quantum bits, or qubits. Many physicists and computer scientists believe they will soon outperform traditional computers. Unfortunately, reaching that goal has proven to be a difficult challenge. Several big-name companies have built quantum computers, but none are ready to compete with traditional hardware just yet. These companies have, over time, come to use the number of qubits in a given quantum computer as a means of measuring its performance, but most in the field agree that such a number is not really a good way to compare two very different quantum computers.

IBM is one of the big-name companies working to create a truly useful quantum computer and, as part of that effort, has built models that it sells or leases to other companies looking to jump on the quantum bandwagon as soon as it becomes viable. In its announcement, IBM focused specifically on the term "quantum volume", a metric that has not previously been used in the quantum computing field. IBM claims that it is a better measure of true performance, and is therefore using the metric to show that the advancement of the company's System Q One quantum computer has been following Moore's Law.

As part of its announcement, IBM published on its corporate blog an overview of the results of testing several models of its System Q One machine. Quantum volume, a metric created by a team at IBM, is described as accounting for "gate and measurement errors as well as device cross talk and connectivity, and circuit software compiler efficiency." The team that created the metric wrote a paper describing it and how it is calculated, which they uploaded to the arXiv preprint server last November. In that paper, they noted that the new metric "quantifies the largest random circuit of equal width and depth that the computer successfully implements," and pointed out that it is also strongly tied to error rates.

More information: … ower-quantum-device/

Journal information: arXiv

Citation: IBM announces that its System Q One quantum computer has reached its 'highest quantum volume to date' (2019, March 5) retrieved 18 August 2019.

© 2019 Science X Network
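Per the quoted definition, quantum volume works out to 2^n for the largest n such that "square" random circuits of width n and depth n are implemented successfully; in IBM's published protocol, success means the measured heavy-output probability exceeds 2/3 (with a statistical confidence test that is omitted here). A simplified sketch of how the final number follows from per-size results:

```python
def quantum_volume(heavy_output_prob):
    """Given a mapping from circuit size n (width = depth = n) to the measured
    heavy-output probability for random model circuits of that size, return
    2**n for the largest n that passes the 2/3 threshold. All smaller tested
    sizes must pass as well; a failure caps the quantum volume. This is a
    simplification of IBM's protocol (no confidence-interval test)."""
    qv = 1
    for n in sorted(heavy_output_prob):
        if heavy_output_prob[n] > 2.0 / 3.0:
            qv = 2 ** n
        else:
            break  # this size failed, so larger sizes cannot count
    return qv

# Hypothetical measurements: sizes 2-4 pass, size 5 falls below threshold.
result = quantum_volume({2: 0.81, 3: 0.74, 4: 0.69, 5: 0.61})  # → 16
```

Under this scheme, "doubling quantum volume each year" means passing the threshold at one additional circuit size per year, which is the Moore's-Law-style trend IBM's announcement claims.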

Alipurduar conducts week-long drive to check mid-day meal quality

Kolkata: The Alipurduar district administration recently undertook a week-long special drive to monitor the quality of mid-day meals in schools of the district. The special drive, 'Poshan Nirikshon', involved inspection of 57 schools across six blocks of the district. The checking covered 28 parameters earmarked by the district administration, including whether the menu board is displayed, whether the headmaster tastes the food before serving, and the availability of utensils and toilets. The drive was conducted after complaints regarding the quality of mid-day meals were received from schools.

"We hope this programme will be instrumental in protecting children from classroom hunger and increasing school enrollment and attendance," a senior official of the district administration said.

Due to an acute staff crunch at the Sub-Inspector level, the district administration initiated the special drive with officers of the collectorate acting as supplementary inspecting officers. "The team will be giving us feedback, on the basis of which measures will be taken for improvement," the official added. According to sources, after the team interacted with students, infrastructure deficits were found in a number of schools, such as faulty handpumps, a lack of separate toilets, boundary walls and dining space, and the absence of eggs in meals. "On the basis of the feedback, we have directed the BDOs in charge of the respective schools to complete the remedial measures on a war footing. We have also show-caused a number of teachers who were found to be deviating from the required standard," the official added.

The initiative will gradually cover all 1,635 schools of the district.

KMDA forms Advisory Committee to monitor health of bridges

Kolkata: The Kolkata Metropolitan Development Authority (KMDA) on Wednesday constituted an Advisory Committee to monitor the health of the bridges in the city. The move comes in the wake of the Majerhat Bridge mishap on September 4.

It may be mentioned that the state government has already prepared a list of 15 bridges that need structural health audits on a priority basis. The list includes Dhakuria Bridge connecting Golpark with Dhakuria, Bijon Setu connecting Gariahat with Kasba, Aurobindo Setu connecting Gouribari with Ultadanga, Chetla Bridge connecting Kalighat with Chetla, Kalighat Bridge connecting Kalighat with Gopalnagar, Durgapur Bridge connecting New Alipore with Chetla, Ultadanga Flyover connecting EM Bypass with VIP Road, Sukanta Setu connecting Jadavpur with Santoshpur, Bankim Setu connecting Howrah Maidan with Howrah Station, Chingrighata Flyover connecting EM Bypass with Salt Lake, Sealdah Flyover, Bridge Number 4 on the Park Circus rail line, Jibanananda Setu near Jadavpur police station and the arch-type bridge at Karunamoyee in Tollygunge.

According to sources in KMDA, the advisory committee will have Amitava Ghosal, a distinguished civil engineer and a member of the Technical Advisory Group for a number of new railway projects; Professor Sriman Bhattacharya or Professor Nirjhar Dhong, or both, from the Civil Engineering department of IIT Kharagpur; Asish Kumar Sen, chief engineer of KMDA; Bhaskar Sengupta, chief engineer (Roads and Bridges), KMDA; and Samiran Sen, independent consultant and fellow of the Institute of Engineers.

The committee will recommend the names of the agencies that will carry out structural health audits on an immediate basis for a number of bridges and structures that are in bad condition. "It will assist the authorities in interpreting the reports of the agencies and arriving at a final action plan, and also suggest monitoring activities and finalise the terms of reference for the studies to be carried out, including tests to be performed. It will further recommend instrumentation that may be feasible for immediate monitoring of the bridges," a KMDA source said.

Youths held for assault on cop

Kolkata: Two youths were arrested for allegedly assaulting a ticket inspector of the West Bengal Transport Corporation (WBTC) and a police officer on Friday morning at Karunamoyee in Salt Lake. The youths were remanded to judicial custody till August 22 after being produced before the Bidhannagar Court.

According to police, two siblings, identified as Arijit Gupta and Soumyajit Gupta, were travelling in a bus on the S-9 route on Friday morning. During the journey, a ticket inspector of WBTC, identified as Narayan Chandra Guha, boarded the bus and was checking the passengers' tickets. When he asked the duo to show their tickets, they stated that they had not purchased any. When Guha asked them to purchase tickets immediately, Arijit and Soumyajit refused and used filthy language at Guha.

Guha remained silent till the bus arrived at the Karunamoyee bus terminus. There he told the duo that travelling without a valid journey ticket would attract a fine. Hearing this, the duo again used abusive words, and when Guha protested he was allegedly assaulted by the siblings.

They were immediately detained and Bidhannagar North police station was informed. After a few minutes, an Assistant Sub-Inspector (ASI) of police, identified as Arunava Pan, reached the spot and intervened. After hearing about the situation, he reportedly asked the duo to pay the fine and leave. At this, the youths became furious and assaulted Pan with fists and blows. Pan suffered serious injuries and was rushed to Bidhannagar Sub-Divisional Hospital, where he was treated and discharged.

Later, Guha lodged a complaint and the duo was arrested on charges of obstructing a public servant in the discharge of public functions, voluntarily causing hurt to deter a public servant from his duty, and assault or criminal force to deter a public servant from the discharge of his duty.

The Real Reason Ian McKellen Turned Down the Role of Dumbledore in Harry Potter

Sir Ian McKellen is a living legend. But when Stephen Sackur, host of the BBC's HARDtalk, asked him, "Is there a role you've ever turned down because it was too puerile, too silly, just…?", he replied: "About once a week, yes. Oh yeah, lots of stuff."

After Richard Harris passed away in 2002 of Hodgkin's disease, Ian McKellen was offered the chance to portray Professor Dumbledore in the Harry Potter series. He was already playing a famed archetypal wizened wizard in a different book adaptation and could easily have inhabited yet another one. Still, he said no. Only recently, during the 20th-anniversary episode of a British talk show, did the actor reveal the real reason he declined one of the greatest offers to fall unexpectedly into his lap.

"When [Richard Harris] died (he played Dumbledore, the wizard), I played the real wizard, of course," said McKellen, referring to his career-defining portrayal of Gandalf in the Lord of the Rings trilogy. "When they called me up and said would I be interested in being in the Harry Potter films, they didn't say what part, but I worked out what they were thinking, and I couldn't."

A former rugby player, a prolific stage and screen actor, as well as a fine singer, Richard Harris was not a knight like his colleague, but he lived to be acknowledged as perhaps the finest Irish actor who ever lived. He acted in about 70 movies over the course of his 50-year-long illustrious career. Highly critical of them as he was, Harris said not long before he died that "sometimes you have to make a low standard of film to sustain a high standard of living."

But the films Harris acted in during the later stages of his life were not of a low standard. On the contrary, they were nothing short of spectacular, and his performances brilliant. Think of Gladiator and his portrayal of the Roman emperor Marcus Aurelius, or of Harris as Abbé Faria, the priest-philosopher turned political prisoner thrown into the Château d'If, the prison off the coast of Marseilles, where he counted stones and dug holes alongside Edmond Dantès (Jim Caviezel) in the film adaptation of Dumas's The Count of Monte Cristo.

And as Dumbledore, the headmaster of the wizarding school of Hogwarts, Richard Harris was just perfect. However, it was not meant to last, and much as he was invaluable to the story and irreplaceable to the franchise, he simply had to be replaced by someone else, which turned out to be even more problematic than the filmmakers first feared. Eventually, the role went to Michael Gambon, but before that decision, recasting was a challenge.

"The role is so fundamental to the character and narrative of the movies, and was played so beautifully by the late Richard Harris, that the studio and filmmakers intend to make a very careful and considered choice in casting the next actor to embody the headmaster of Hogwarts School," a spokeswoman for Warner Bros. declared after the actor died. Harris had only taken part in Harry Potter and the Philosopher's Stone in 2001 and Harry Potter and the Chamber of Secrets the year after.

Everyone saw Harris's lifelong friend Peter O'Toole as the prime candidate to take on the role, and reportedly the actor's family members were eager to see O'Toole become Dumbledore and wave the Elder Wand as his friend had. The two men were close and almost the same age, which seemed to make it the perfect choice. However, the years were catching up with O'Toole, and there were worries about whether he would be physically able to endure the six remaining films. Also, although never officially stated, it is safe to assume it would be tough for an actor to feel he must live up to or reproduce his close friend's brilliant performance.

McKellen, now 78 and already an experienced screen wizard, albeit in a different franchise, was then offered the chance to replace Harris, but he declined out of respect for himself and for the other actor. McKellen turned down the lucrative and tempting offer because he knew Harris had not thought much of his acting ability: "I couldn't take over the part from an actor who I'd known hadn't approved of me," explained McKellen during the interview.

As critical as Harris was of his own work, he was just the same, if not more so, about the work of others. He once described McKellen, Sir Derek Jacobi and Kenneth Branagh as "technically brilliant but passionless" actors. "They are technically brilliant, like Omega watches, but underneath they are hollow because their lives are hollow," he said of them, according to The Telegraph. McKellen acknowledged the jab, saying, "I was in good company, yeah," referring to being part of a group accused of being passionless. Seemingly unfazed, he continued with "Nonsense," smiled, and moved on to the next question from his host: "You could have been Dumbledore?" McKellen grinned but gave no audible answer.

"When I see the posters of Mike Gambon, who gloriously played Dumbledore, I think sometimes it's me. We get asked for each other's autographs," he said, before concluding the interview by confirming that when his days on Earth come to an end, he would be amused if his gravestone read "Here Lies Gandalf." For, as McKellen says, Gandalf is the only character he has ever played that gave him the opportunity to "be in contact with lots of people, especially young ones from all over the world, that I couldn't possibly know about and they let me into their lives to an extent," emphasizing that it was a privilege "to be allowed to impersonate a character that already was in the zeitgeist and meant a great deal as an example of how to behave in the world."