Improved Superconductivity in Multi-Walled Carbon Nanotubes

A group of researchers from several institutions in Japan has observed superconductivity — a phenomenon in which electrons flow with no resistance — in billionth-of-a-meter-sized cylindrical carbon molecules known as multi-walled carbon nanotubes (MWNTs). The nanotubes' ability to superconduct adds to their many intriguing electrical and physical characteristics. Moreover, it increases the likelihood that carbon nanotubes will one day drastically improve electronics, building materials, and many other products.

To be fair, observing a supercurrent through carbon nanotubes is not a new discovery. But past studies, which used ropes made of single-walled carbon nanotubes (those consisting of just one cylinder rather than several nested cylinders), only achieved superconductivity by deep-freezing the nanotubes down to about 0.4 kelvin (K). Such an ultra-low "critical temperature," as it's called — just fractions of a degree above 0 K, the coldest temperature possible — is very difficult to achieve and maintain in a laboratory.

"In our study, the nanotubes superconducted at a much more manageable critical temperature of 12 K," said Aoyama Gakuin University scientist Junji Haruyama, the lead author of the paper describing the work, which appears in the February 10, 2006, online edition of Physical Review Letters. "While 12 K is still extremely cold by everyday standards, it requires far less work to sustain. Also, in terms of potential applications of superconducting nanotubes, such as quantum molecular computing, this higher temperature is far more promising."

The scientists measured the supercurrent through the nanotubes by creating arrays of nano-sized electric "junctions" — very thin conducting layered structures. They began with a layer of aluminum, prepared so that it contained a grid of nanoscale pores. On top of this they deposited a layer of MWNTs, which inserted themselves vertically into the aluminum pores. Finally, they topped the nanotubes with a layer of gold.

The group created three of these arrays. By carefully cutting off part of the nanotube layer, they created one array in which the nanotubes were flush with the aluminum surface and another in which the nanotubes jutted out slightly above the surface. For the third array, no cutting was done; each nanotube remained longer than the depth of its pore, and thus "spilled" over onto the aluminum.

These three cases correspond to different degrees of nanotube-gold contact, referred to as "end bonding." In the first array the nanotubes are only slightly end bonded with the gold, while in the third they are fully end bonded.

End bonding turned out to be one important factor affecting the nanotubes' ability to superconduct: only the array containing entirely end-bonded MWNTs exhibited superconductivity at 12 K. Because the nanotubes were folded over, the gold could only make contact with the outer shell of each nanotube, rather than also bonding with the inner shells; however, in this third case the gold touched far more nanotube surface area.

"We concluded that being entirely end bonded with the gold electrically activated all the shells in each nanotube," said Haruyama. "In the other two arrays, only some of the shells were activated. This indicates that superconductivity in MWNTs is strongly related to the number of electrically active shells and, by extension, that electric interactions between shells play a large role."

Haruyama and his colleagues are planning several follow-up studies. These include an experiment that will attempt to increase the critical temperature of the nanotubes, as well as an investigation into how coupling neighboring nanotubes in the array's MWNT layer could affect their superconductivity.

Citation: "Superconductivity in Entirely End-Bonded Multiwalled Carbon Nanotubes," Phys. Rev. Lett. 96, 057001 (2006)

By Laura Mgrdichian, Copyright 2006 PhysOrg.com
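To put the two critical temperatures in perspective, here is a brief illustrative Python sketch (not from the paper; it uses only the 0.4 K and 12 K values quoted above plus the Boltzmann constant) comparing the thermal energy scale k_B*Tc in the two cases.

```python
# Illustrative only: compares the thermal energy scale k_B * Tc at the two
# critical temperatures mentioned in the article (0.4 K for single-walled
# nanotube ropes in earlier work vs. 12 K for the end-bonded MWNTs).
# The temperatures come from the article; the constant is standard.

K_B_EV_PER_K = 8.617333262e-5  # Boltzmann constant in eV/K

def thermal_energy_mev(temperature_k):
    """Thermal energy k_B * T expressed in millielectronvolts."""
    return K_B_EV_PER_K * temperature_k * 1e3

for label, t_c in [("SWNT ropes (earlier work)", 0.4),
                   ("End-bonded MWNTs (this work)", 12.0)]:
    print(f"{label}: Tc = {t_c} K, k_B*Tc ~ {thermal_energy_mev(t_c):.3f} meV")

# A 30x higher critical temperature means a correspondingly larger thermal
# energy scale, which is part of why 12 K is far easier to maintain.
print(f"Ratio of critical temperatures: {12.0 / 0.4:.0f}x")
```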

Former physicist investigates May 6 flash crash

(PhysOrg.com) — Ever since the "flash crash" of May 6, 2010, investors have been wondering exactly what happened that Friday afternoon. As stock markets were trending down due to concern about the debt crisis in Greece, the Dow Jones Industrial Average suddenly plunged 600 points in about five minutes. Then, 20 minutes later, the Dow recovered most of the loss and finished the day down more than 200 points.

[Image: The daily chart of the Dow during the flash crash. Credit: Wikipedia]

Gregg Berman, a former particle physicist with 16 years of experience on Wall Street, is working for the Securities and Exchange Commission (S.E.C.), leading a team of more than 20 investigators to find out what caused the flash crash. Over the past several months, the investigators have gathered and analyzed large amounts of data and interviewed hundreds of companies, and they now plan to publish a report of their findings within the next two weeks. Although Berman isn't revealing details about the results, in an article in The New York Times he says that he found no evidence of a deliberate attempt to manipulate the markets. Instead, the investigators have identified a specific series of events that led up to the crash.

"The report will clearly demonstrate how market conditions and events prior to the flash crash led to the extreme price moves," Berman said, adding that the overall picture may not be as simple and straightforward as many people would like.

In the weeks and months after the flash crash, a number of theories were proposed to explain the volatile activity. One of the first was the "fat finger" theory, in which someone accidentally sold an extremely large order, triggering more selling. Another was that the European debt crisis had made the markets volatile, which caused some high-frequency automated trading programs to stop trading, which in turn caused confusion among other computerized programs. Regulators now consider both of these theories doubtful.

In contrast with these simpler explanations, Berman said that his explanation involves several factors occurring simultaneously. He explained that market participants were doing different things for different reasons, and that the varying "circuit breaker" policies of different exchanges caused a lack of coordination and confusion, all of which played a role in the crash.

The results of the report will likely be used by a group of advisers to the S.E.C. and the Commodity Futures Trading Commission to make policy recommendations. The results and subsequent policy changes may also reassure investors, who have withdrawn money from stock mutual funds every week since the flash crash.

More information: via The New York Times

© 2010 PhysOrg.com
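The point about differing "circuit breaker" policies can be made concrete with a small toy sketch. The Python snippet below is purely illustrative (the thresholds and price levels are invented, not actual exchange rules or S.E.C. findings); it simply shows how two venues with different halt thresholds can respond differently to the same rapid price move, producing the kind of coordination gap described above.

```python
# Toy illustration (not based on any actual exchange rules): shows how two
# venues with different "circuit breaker" thresholds can react differently
# to the same rapid price move. Thresholds and prices are invented.

def should_halt(reference_price, current_price, drop_threshold_pct):
    """Halt trading if the price has fallen more than this venue's threshold."""
    drop_pct = 100.0 * (reference_price - current_price) / reference_price
    return drop_pct >= drop_threshold_pct

reference = 10_600.0     # index level before the plunge (illustrative)
during_crash = 10_000.0  # level a few minutes later (illustrative)

venues = {"Exchange A": 5.0, "Exchange B": 10.0}  # percent-drop halt thresholds
for venue, threshold in venues.items():
    halted = should_halt(reference, during_crash, threshold)
    print(f"{venue} (halt at {threshold}% drop): {'HALTS' if halted else 'keeps trading'}")
```

With these made-up numbers, one venue halts while the other keeps trading, which is exactly the kind of uncoordinated behavior the investigators pointed to.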

Survey shows physicists can't agree on fundamental questions about quantum mechanics

(Phys.org)—A trio of physicists has uploaded a paper to the preprint server arXiv describing the results of a survey passed out to attendees at a physics conference held in 2011: Quantum Physics and the Nature of Reality. The purpose of the survey was to find out how much agreement or disagreement there is in the physics community regarding the most fundamental ideas of quantum mechanics. Surprisingly, the results showed that there is still very little consensus among physicists regarding some of its most basic principles.

Quantum mechanics at its heart is the study of the building blocks of the universe – what they are and how they work together to form reality as we are able to interpret it. Its ideas were first developed almost a century ago, with such notables as Albert Einstein and Niels Bohr developing theories and debating ideas such as whether particles exist at certain places at certain times, or whether they move around constantly with only a probability of being someplace at a given moment. The second idea famously led Bohr to conclude that if that were the case, then the universe is indeterminate and, at its base, probabilistic. Refusing to believe such a possibility could be true, Einstein responded with perhaps his most famous quote, that "God does not play dice with the universe."

Now, nearly a century later, modern physicists are still just as divided. In the survey, just 42 percent of respondents agreed with Bohr's assertions – the rest were divided among several other theories. Also likely surprising to those outside the physics community, a full 64 percent of those who responded to the survey said they believe Einstein's view of the universe "is wrong."

Another idea that appears to still vex the modern physicist is whether quantum objects have definite physical properties independent of measurement; just over half thought so. There is also the ongoing argument about whether a true quantum computer will ever come to pass, and if it does, when that might happen. The largest group, 42 percent, said they believe it will happen 10 to 25 years from now, 30 percent said it would come later than that, while just 9 percent thought it might happen sooner.

Based on the results of the survey, it appears Richard Feynman was right when he once responded to a reporter's question about how well quantum mechanics is understood by saying that "anyone who claims to understand quantum theory is either lying or crazy."

More information: A Snapshot of Foundational Attitudes Toward Quantum Mechanics, arXiv:1301.1069 [quant-ph], arxiv.org/abs/1301.1069

Abstract: Foundational investigations in quantum mechanics, both experimental and theoretical, gave birth to the field of quantum information science. Nevertheless, the foundations of quantum mechanics themselves remain hotly debated in the scientific community, and no consensus on essential questions has been reached. Here, we present the results of a poll carried out among 33 participants of a conference on the foundations of quantum mechanics. The participants completed a questionnaire containing 16 multiple-choice questions probing opinions on quantum-foundational issues. Participants included physicists, philosophers, and mathematicians. We describe our findings, identify commonly held views, and determine strong, medium, and weak correlations between the answers. Our study provides a unique snapshot of current views in the field of quantum foundations, as well as an analysis of the relationships between these views.

© 2013 Phys.org
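The abstract mentions identifying strong, medium, and weak correlations between answers to the 16 multiple-choice questions. As a rough illustration of how such an analysis can be done, here is a minimal Python sketch using invented responses (not the actual survey data) and a standard association measure, Cramér's V, computed between two questions.

```python
# Toy sketch (invented data, not the actual survey responses): one simple way
# to look for associations between answers to two multiple-choice questions.
# Each list entry is one "participant's" chosen option for that question.

from collections import Counter
from itertools import product

q1 = ["Copenhagen", "Copenhagen", "Many-worlds", "Information", "Copenhagen", "Many-worlds"]
q2 = ["yes", "yes", "no", "no", "yes", "no"]

def joint_counts(a, b):
    """Contingency table of answer pairs."""
    return Counter(zip(a, b))

def cramers_v(a, b):
    """Cramer's V association measure (0 = no association, 1 = perfect)."""
    n = len(a)
    table = joint_counts(a, b)
    rows, cols = sorted(set(a)), sorted(set(b))
    chi2 = 0.0
    for r, c in product(rows, cols):
        expected = (a.count(r) * b.count(c)) / n
        observed = table.get((r, c), 0)
        chi2 += (observed - expected) ** 2 / expected
    k = min(len(rows), len(cols)) - 1
    return (chi2 / (n * k)) ** 0.5 if k > 0 else 0.0

print(joint_counts(q1, q2))
print(f"Cramer's V between the two questions: {cramers_v(q1, q2):.2f}")
```

A value near 0 means the answers to the two questions are unrelated in the sample; a value near 1 means one answer strongly predicts the other.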

SCHAFT team tops scores at DARPA Robotics Challenge

(Phys.org) —For those wondering which of the 16 competing teams would walk away as top performers in the two-day DARPA Robotics Challenge in Florida over the weekend, the suspense is over. SCHAFT, a Japanese company newly acquired by Google, won the most points, 27 out of a possible 32. SCHAFT outscored some formidable big-name contenders such as MIT, Carnegie Mellon, and NASA. IHMC Robotics placed second. Third place went to Tartan Rescue, from Carnegie Mellon University, and fourth place was awarded to a team from the Massachusetts Institute of Technology. Fifth place went to RoboSimian, designed by NASA's Jet Propulsion Laboratory.

All in all, there were eight top scorers; Team TRACLabs, WRECS (Worcester Polytechnic Institute) and Team TROOPER (Lockheed Martin) were the next three. The eight teams now have the opportunity to continue their work with the help of Defense Advanced Research Projects Agency (DARPA) funding and will compete in the finals event at the end of 2014, where one team will net the $2 million prize. The finals will require robots to attempt a circuit of consecutive physical tasks with degraded communications between the robots and their operators.

DARPA said that the 16 teams at this year's challenge in Miami represented a mix of government, academic and commercial backgrounds, coming not only from the United States but also from South Korea and Japan. SCHAFT's high scores were impressive, as the DARPA Robotics Challenge (DRC), established to advance the state of the art in humanoid robotics, is considered a baseline on the current state of the field. The event is a marker for assessing the evolution of robots in hazardous first-responder environments, a demonstration of what is so far possible in pushing technologies closer to the point where robots will help out in a range of rescue tasks quickly, efficiently and with minimal human interaction.

The robots in the latest challenge had to complete eight tasks, including climbing up and down a ladder and removing obstacles and debris. Narito Suzuki, COO at SCHAFT, in a video showing the entry, pointed out their bipedal robot's strength and stability in navigating its way around.

The company has its roots in the University of Tokyo's JSK Laboratory. Interestingly, earlier this year IEEE Spectrum called attention to SCHAFT as a Japanese startup that had announced a breakthrough in motor technology that may bypass the limitations of existing systems. The April article said the company had developed a kind of actuator that may make robotic muscles much stronger. IEEE Spectrum also remarked that the DARPA challenge "will be a great opportunity for SCHAFT to show off its innovative motors. A good performance at the competition would compel the next generation of humanoid robots, in Japan and elsewhere, to adopt the technology."

More information: www.theroboticschallenge.org/

© 2013 Phys.org

NASA considers possibilities for manned mission to Venus

(Phys.org) —NASA's Systems Analysis and Concepts Directorate has issued a report outlining a possible way for humans to visit Venus, rather than Mars—by hovering in the atmosphere instead of landing on the surface. The hovering vehicle, which they call a High Altitude Venus Operational Concept (HAVOC), would resemble a blimp with solar panels on top, and would allow people to do research just 50 kilometers above the surface of the planet.

[Image: HAVOC. Credit: NASA Langley Research Center]

Most everyone knows that NASA wants to send people to Mars—that planet also gets most of the press. Mars is attractive because it looks more like Earth and is relatively close to us. The surface of Venus, on the other hand, though slightly closer, is not so attractive, with temperatures that can melt lead and atmospheric pressure 92 times that of Earth. There's also that thick carbon dioxide atmosphere with sulfuric acid clouds, lots of earthquakes, volcanoes going off and terrifying lightning bolts. So why would anyone rather go to Venus than Mars? Because of far lower radiation and much better solar energy.

No one wants to go to the surface of Venus, at least not anytime soon. Instead, researchers at NASA are looking into the possibility of sending people to hover in the sky above the planet, conducting research in a far less dangerous place than even the surface of Mars. At 50 kilometers up, a HAVOC would experience just one atmosphere of pressure and temperatures averaging just 75 degrees Celsius, with radiation levels equivalent to those in Canada. Astronauts on Mars, on the other hand, would experience 40 times the amount of radiation typically faced back here on Earth, which suggests they'd have to live deep underground to survive—a problem that scientists have not yet solved.

The one hitch to floating around Venus would, of course, be figuring out how to get both humans and a HAVOC to the planet, and then getting the humans back home safely to Earth at some point. The initial plans call for several missions building up to the final one, with spaceships first carrying unmanned vehicles to test the concept of a HAVOC, followed by missions in which humans would orbit the planet. Next, scientists would have to come up with a feasible design for deploying a floating vehicle able to unfurl, fill itself with gas, and hover for long stretches of time in the sky above the planet. After that, vehicles would have to be designed to work with such a craft: to serve as a ferry between the HAVOC and an orbiting craft, to travel back and forth to Earth, and perhaps to shuttle between a craft orbiting Earth and the surface. A lot of work, no doubt, but one that seems possible, even as more and more space scientists are beginning to wonder about the feasibility of sending humans to the surface of Mars.

More information: via IEEE Spectrum

© 2014 Phys.org
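For a sense of why a blimp-like vehicle can float at the altitude described above, here is a back-of-the-envelope Python sketch. It uses the conditions quoted in the article (about one atmosphere of pressure and roughly 75 degrees Celsius at 50 kilometers) and the ideal gas law; the choice of filling gases is an assumption made for the illustration, not a detail from the NASA report.

```python
# Back-of-the-envelope sketch (not from the NASA report): ideal-gas densities
# at the conditions the article quotes for 50 km altitude on Venus, roughly
# 1 atmosphere and about 75 degrees Celsius. The filling gases below are
# illustrative assumptions, used only to show why a blimp-like vehicle can
# generate buoyant lift in a CO2 atmosphere.

R = 8.314          # J/(mol*K), universal gas constant
P = 101_325.0      # Pa, ~1 atmosphere
T = 75.0 + 273.15  # K, ~75 degrees Celsius

MOLAR_MASS = {     # kg/mol
    "CO2 (Venus atmosphere)": 0.0440,
    "breathable air": 0.0290,
    "helium": 0.0040,
}

def density(molar_mass_kg_per_mol):
    """Ideal-gas density rho = P*M / (R*T), in kg/m^3."""
    return P * molar_mass_kg_per_mol / (R * T)

rho_co2 = density(MOLAR_MASS["CO2 (Venus atmosphere)"])
for gas in ("breathable air", "helium"):
    lift = rho_co2 - density(MOLAR_MASS[gas])
    print(f"Filling with {gas}: ~{lift:.2f} kg of lift per cubic meter of envelope")
```

Because the surrounding atmosphere is dense carbon dioxide, even an ordinary light gas displaces enough mass to provide useful lift per cubic meter of envelope.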

Magic states offer surprisingly low error rates for quantum computing

(Phys.org)—Quantum computers hold a special allure, as they offer a way to harness quantum phenomena and put them to use to do things that are impossible for ordinary computers. But as powerful as quantum computers could be, they are also delicate in a way, since they must be shielded from the "noise" in the environment that causes detrimental errors. Making quantum computers that are noise-resistant, or fault-tolerant, is one of the biggest challenges facing their development.

Currently, the leading approach to fault-tolerant quantum computing involves "magic states." First proposed in 2005 by Sergey Bravyi and Alexei Kitaev, magic states are quantum states that contain an acceptably low level of error. In order to create magic states, physicists take noisy quantum states and use a process called distillation to derive a smaller number of improved, i.e., higher-fidelity, states. This process is repeated as many times as necessary until the states reach the target fidelity.

Although distillation works, it is a resource-intensive process that requires the majority of a quantum computer's hardware. In some cases, up to 90% of a quantum computer's qubits are needed to create magic states before any real computing can be done.

To address this problem, Ying Li, a physicist at the University of Oxford, has looked for a way to minimize the noise in raw magic states (before any distillation) in order to reduce the number of distillation steps required, and in turn reduce the resource cost. In his work, he made a surprising discovery: raw magic states can have a fidelity that is superior to that of the operations that created them.

[Image: Schematic of the protocol for encoding a magic state into the surface code with high fidelity. Circles represent data qubits. Credit: Li. (CC-BY-3.0)]

Li's protocol takes advantage of the fact that qubits are more sensitive to noise when the code distance (which is related to the number of qubits in a row of the lattice) is small, and more stable when the code distance is larger. After an initial encoding step, the protocol enlarges the code distance in order to reduce the error rate. Even though a large number of operations are required to create a single magic state, Li showed that the infidelity of a raw magic state created by the new method is less than half the infidelity of even a single quantum gate used to create it.

The new method could lead to significant advantages for fault-tolerant quantum computing. For one type of magic state, for example, the new protocol can reduce the error due to noise by more than 20 times compared to previous protocols. As a result, the number of raw magic states required can be reduced by a factor of 15. This improvement translates to fewer distillation steps and a dramatic reduction in the hardware needed for quantum computing tasks.

More information: Ying Li. "A magic state's fidelity can be superior to the operations that created it." New Journal of Physics. DOI: 10.1088/1367-2630/17/2/023037

© 2015 Phys.org
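To see why cleaner raw magic states translate into fewer distillation steps and less hardware, consider the following Python sketch. It is not Li's protocol; it simply applies the commonly quoted scaling for the standard 15-to-1 distillation routine of Bravyi and Kitaev, in which each round turns 15 states with error rate e into one state with error roughly 35*e^3. The raw error rates and target below are assumptions chosen for illustration.

```python
# Illustrative sketch of why cleaner "raw" magic states save resources.
# Uses the commonly quoted scaling for 15-to-1 magic state distillation,
# error_out ~ 35 * error_in**3 (Bravyi-Kitaev protocol). The input error
# rates and the target are assumptions, not numbers from Li's paper.

def rounds_to_target(raw_error, target_error):
    """Return (distillation rounds, raw states consumed) to reach target_error."""
    error, rounds = raw_error, 0
    while error > target_error:
        error = 35.0 * error ** 3
        rounds += 1
    return rounds, 15 ** rounds  # each round consumes 15 states per output state

target = 1e-12
for raw in (1e-2, 5e-4):  # a "noisy" raw state vs. a much cleaner one
    rounds, raw_states = rounds_to_target(raw, target)
    print(f"raw error {raw:.0e}: {rounds} round(s), ~{raw_states} raw states per output")
```

With these example numbers, the cleaner raw state reaches the target in one fewer round, cutting the raw-state count by a factor of 15 per output state; this is the kind of resource saving the article describes.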

New insight into nanopatterning diamond

The ability to etch nanostructures onto the surface of diamond is expected to have a wide variety of potential applications, but so far etching and patterning diamond at the nanoscale has been challenging, as diamond is highly chemically inert (unreactive). In a new study, researchers have investigated a technique in which an electron beam is used to nanopattern diamond, with the results offering new insight into emerging nanofabrication processes.

[Image: Electron beam-induced etching on diamond using different ratios of hydrogen and oxygen gases to control the anisotropy. With pure oxygen, the etching is isotropic and no patterns are observed. Adding hydrogen gives rise to anisotropic etching, resulting in patterns. Credit: Bishop et al. ©2018 American Chemical Society]

The researchers, James Bishop et al., at the University of Technology Sydney in Sydney, Australia, have published a paper on the nanopatterning and etching of diamond in a recent issue of ACS Nano.

In their work, the researchers investigated a technique called gas-mediated electron beam-induced etching. The method requires no mask or resist layer and uses electron beam irradiation in the presence of reactive gases to directly etch diamond and other materials with a spatial resolution as high as 10 nanometers. It also avoids the residual damage issues associated with physical etch techniques such as focused ion beam milling or reactive ion etching, enabling etching with minimal damage to the underlying material. So far, most work using this method has demonstrated etching that appears uniform, or isotropic. However, in order to create desired patterns or selectively expose certain crystal planes, it becomes necessary to etch selectively in different orientations, which is called anisotropic etching.

Using a combination of experimental and computational techniques, the researchers found that oxygen and hydrogen gases play different roles in the etching process. In particular, oxygen causes rapid, efficient and isotropic etching, while the addition of hydrogen slows down the rate of etching of certain crystal planes more than others, enabling anisotropic etching. Anisotropic etching has long been used with other materials such as silicon and gallium nitride to create micro- and nanostructures with near-perfect symmetry and ultra-smooth crystal planes. This new work highlights a method to potentially achieve similar results with diamond.

The researchers found that, as more hydrogen gas is added to the system, patterns emerge whose features are aligned with the crystal directions of the diamond lattice. The scientists explain that these patterns are caused by hydrogen's preferential passivation of certain crystal planes over others. The researchers also showed that it is possible to control the anisotropy by controlling the amount of hydrogen and, consequently, to manipulate the geometries of the surface patterns. This enabled the researchers to create a detailed model of the etch kinetics, which should simplify future dry etch nanofabrication processes for diamond and enable fabrication of previously untenable structures.

"The most significant outcome of the work is the control over etch anisotropy that it enables," Bishop told Phys.org. "Isotropic etching is useful for etching arbitrarily shaped structures. Anisotropic etching is useful for creating structures with ultra-smooth surfaces and near-perfect symmetries defined by the kinetics of the anisotropic etch reaction. With electron beam-induced etching using oxygen we can obtain high-rate isotropic etching, and by mixing in hydrogen, achieve highly anisotropic etching of diamond."

The ability to controllably etch nanopatterns and selectively expose and smooth certain crystal planes on the surface of diamond has a wide variety of potential applications. Different nanopatterns and nanostructures can, for example, expedite neuron growth on diamond surfaces for biosensing applications, as well as enhance light extraction for photonic applications. Diamond is also being investigated for possible applications in high-power electronics, electrochemistry, and catalysis, all of which may benefit from a simple, high-resolution nanopatterning method.

More information: James Bishop et al. "Deterministic Nanopatterning of Diamond Using Electron Beams." ACS Nano. DOI: 10.1021/acsnano.8b00354

© 2018 Phys.org
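The role hydrogen plays can be captured in a toy model. The Python sketch below is not the kinetics model from the paper; the rates and passivation strengths are invented. It only illustrates the qualitative idea that hydrogen suppresses the etch rate of some crystal planes more than others, so the ratio of plane-to-plane etch rates (the anisotropy) grows as the hydrogen fraction increases.

```python
# Toy model (invented numbers, not the kinetics model from the paper): a
# minimal way to express the idea that adding hydrogen passivates, and so
# slows, some crystal planes more than others, producing anisotropic etching.

def etch_rate(base_rate, h2_fraction, passivation_strength):
    """Etch rate of one crystal plane, reduced as hydrogen passivates it."""
    return base_rate / (1.0 + passivation_strength * h2_fraction)

# Hypothetical planes: with pure O2 both etch at the same rate (isotropic);
# the second plane is assumed to be passivated more strongly by hydrogen.
planes = {"plane A": 0.5, "plane B": 5.0}  # passivation strengths (assumed)

for h2 in (0.0, 0.25, 0.5):
    rates = {p: round(etch_rate(1.0, h2, s), 3) for p, s in planes.items()}
    anisotropy = rates["plane A"] / rates["plane B"]
    print(f"H2 fraction {h2:.2f}: rates {rates}, anisotropy ratio ~{anisotropy:.1f}")
```

At zero hydrogen the two planes etch at the same rate (isotropic); as the hydrogen fraction rises, one plane lags behind and crystallographic facets emerge, which is the behavior the article describes.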

Astronomers detect low-mass brown dwarf around A-type main-sequence star

Using the HATSouth Exoplanet Survey, an international group of astronomers has discovered a low-mass brown dwarf transiting an A-type main-sequence star. The newly detected brown dwarf, designated HATS-70b, is the first such object found around a star of this type. The finding is detailed in a paper published November 16 on arXiv.org.

Brown dwarfs are intermediate objects between planets and stars. Astronomers generally agree that they are substellar objects occupying the mass range between 13 and 80 Jupiter masses. Although many brown dwarfs have been detected to date, such objects existing as companions of other stars are a rare find.

Now, a team of researchers led by George Zhou of the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Massachusetts, reports the finding of a new example of a rare brown dwarf companion orbiting an A-type main-sequence (or A dwarf) star known as HATS-70. The astronomers employed the Hungarian-made Automated Telescope Network-South (HATSouth) to observe HATS-70 and found that this star is transited by some object. Next, the team conducted follow-up photometric and spectroscopic observations of HATS-70 in order to uncover the basic parameters of the companion and its host.

"We report the discovery of HATS-70b, a transiting brown dwarf at the deuterium burning limit. (…) The transits were first identified by the HATSouth network (Bakos et al. 2013), and confirmed via photometric and spectroscopic follow-up observations that measured the radius and mass of the companion," the researchers wrote in the paper.

[Image: The unbinned r-band HATSouth discovery light curve of HATS-70. Credit: Zhou et al., 2018]

According to the study, HATS-70b is about 38 percent larger than Jupiter, has a mass of some 12.9 Jupiter masses and an equilibrium temperature of 2,370 K. The brown dwarf resides in a close-in orbit, separated from the host by only 0.036 AU, with an orbital period of 1.89 days.

With an age of about 810 million years, the parent star is nearly twice as large as the sun and has a mass of around 1.78 solar masses. The host has an effective temperature of 7,930 K and a luminosity of approximately 12 solar luminosities. The HATS-70 system is located around 4,260 light years away from the sun.

The researchers emphasized that HATS-70b is the first detected brown dwarf companion transiting an A dwarf star. Furthermore, HATS-70b is one of the few massive planet or brown dwarf systems with a measured obliquity. The study found that this object, like previously studied massive planets and brown dwarfs, resides in a low projected-obliquity orbit of 8.9 degrees. Such a low obliquity, when it comes to massive objects, still baffles astronomers.

"The low obliquities of these systems is surprising given all brown dwarf and massive planets with obliquities measured orbit stars hotter than the Kraft break. This trend is tentatively inconsistent with dynamically chaotic migration for systems with massive companions, though the stronger tidal influence of these companions makes it difficult to draw conclusions on the primordial obliquity distribution of this population," the authors of the paper wrote.

They added that finding more high-mass companions with measured obliquities around hot stars could be crucial for determining the origins of close-in brown dwarfs like HATS-70b.

More information: G. Zhou et al. HATS-70b: A 13 Mjup brown dwarf transiting an A star. arXiv:1811.06925 [astro-ph.EP]. arxiv.org/abs/1811.06925

© 2018 Science X Network
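As a quick sanity check on the quoted numbers, Kepler's third law relates the 0.036 AU separation and the 1.78-solar-mass host to the orbital period, which should come out close to the reported 1.89 days. The Python sketch below is an illustration using only figures from the article; the companion's mass is small compared with the star's and is neglected.

```python
# Quick consistency check (not from the paper): Kepler's third law applied to
# the orbital separation and host-star mass quoted in the article, which
# should reproduce the reported ~1.89-day orbital period.

import math

G = 6.674e-11              # m^3 kg^-1 s^-2, gravitational constant
AU = 1.496e11              # m, astronomical unit
M_SUN = 1.989e30           # kg, solar mass
SECONDS_PER_DAY = 86_400.0

a = 0.036 * AU             # semi-major axis from the article
m_star = 1.78 * M_SUN      # host-star mass from the article

period_s = 2.0 * math.pi * math.sqrt(a**3 / (G * m_star))
print(f"Predicted period: {period_s / SECONDS_PER_DAY:.2f} days (article quotes 1.89 days)")
```

Running this gives roughly 1.9 days, consistent with the reported period.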

Graphene-based ink may lead to printable energy storage devices

Researchers have created an ink made of graphene nanosheets and demonstrated that the ink can be used to print 3-D structures. As the graphene-based ink can be mass-produced in an inexpensive and environmentally friendly manner, the new methods pave the way toward developing a wide variety of printable energy storage devices.

[Image: (Top) The salt-templated process of synthesizing graphene nanosheets into ink. (Bottom) The ink and printed demonstration. Credit: Wei et al. ©2019 American Chemical Society]

The researchers, led by Jingyu Sun and Zhongfan Liu at Soochow University and the Beijing Graphene Institute, and Ya-yun Li at Shenzhen University, have published a paper on their work in a recent issue of ACS Nano.

"Our work realizes the scalable and green synthesis of nitrogen-doped graphene nanosheets on a salt template by direct chemical vapor deposition," Sun told Phys.org. "This allows us to further explore thus-derived inks in the field of printable energy storage."

As the scientists explain, a key goal in graphene research is the mass production of graphene with high quality and at low cost. Energy-storage applications typically require graphene in powder form, but so far production methods have resulted in powders with a large number of structural defects and chemical impurities, as well as nonuniform layer thickness. This has made it difficult to prepare high-quality graphene inks.

In the new paper, the researchers demonstrate a method for preparing graphene inks that overcomes these challenges. The method involves growing nitrogen-doped graphene nanosheets over NaCl crystals using direct chemical vapor deposition, which causes molecular fragments of nitrogen and carbon to diffuse on the surface of the NaCl crystals. The researchers chose NaCl due to its natural abundance and low cost, as well as its water solubility. To remove the NaCl, the coated crystals are submerged in water, which dissolves the NaCl and leaves behind pure nitrogen-doped graphene cages. In the final step, treating the cages with ultrasound transforms them into 2-D nanosheets, each about 5-7 graphite layers thick.

The resulting nitrogen-doped graphene nanosheets have relatively few defects and an ideal size (about 5 micrometers in side length) for printing, as larger flakes can block the nozzle. To demonstrate the nanosheets' effectiveness, the researchers printed a wide variety of 3-D structures using inks based on the graphene sheets. Among their demonstrations, the researchers used the ink as a conductive additive for an electrode material (vanadium nitride) and used the composite ink to print flexible electrodes for supercapacitors with high power density and good cyclic stability. In a second demonstration, the researchers created a composite ink made of the graphene sheets along with a binder material (polypropylene) for printing interlayers for Li−S batteries. Compared to batteries with separators made only of the conventional material, those made with the composite material exhibited enhanced conductivity, leading to an overall improvement in battery performance.

"In the future, we plan to exploit the direct chemical vapor deposition technique for the mass production of high-quality graphene powders toward emerging printable energy storage applications," Sun said.

More information: Nan Wei et al. "Scalable Salt-Templated Synthesis of Nitrogen-Doped Graphene Nanosheets toward Printable Energy Storage." ACS Nano. DOI: 10.1021/acsnano.9b03157

© 2019 Science X Network
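The flake dimensions quoted above can be turned into a rough picture of the ink's building blocks. The sketch below is illustrative: the 5-7 layers and roughly 5-micrometer lateral size come from the article, while the 0.335 nm interlayer spacing is the standard value for graphite, assumed here rather than taken from the paper.

```python
# Illustrative estimate (not from the paper): approximate thickness and aspect
# ratio of the nanosheets described in the article. The interlayer spacing is
# the standard graphite value and is an assumption for this sketch.

INTERLAYER_SPACING_NM = 0.335  # typical graphite interlayer spacing (assumed)
LATERAL_SIZE_NM = 5_000.0      # ~5 micrometers side length, from the article

for layers in (5, 7):
    thickness_nm = layers * INTERLAYER_SPACING_NM
    aspect_ratio = LATERAL_SIZE_NM / thickness_nm
    print(f"{layers} layers: ~{thickness_nm:.2f} nm thick, aspect ratio ~{aspect_ratio:,.0f}:1")
```

The very large width-to-thickness ratio is part of what makes such flakes suitable for printing: they are thin enough to disperse in an ink yet small enough laterally not to clog the nozzle.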

Curtains down

Wills India Fashion Week – Spring Summer 2015 came to a close on 12 October. The five-day fashion extravaganza in the Capital witnessed the who's who of the fashion industry rubbing shoulders with fashion lovers and enthusiasts at Pragati Maidan, as more than 55 designers showcased their collections on the ramp and 124 had stalls for buyers at the venue. Day 5 started off with Niharika Pandey, Niket Mishra and Rahul Singh, making way for Joy Mitra, Neeta Bhargava, Rehane, Nida Mahmood, Kanika Saluja (Annaikka) and Roopa Pemmaraju, with Rohit Bal bringing the curtains down with his grand finale collection Gulbagh at Quli Khan's Tomb. Here's a look at the day's offerings…