
GMAT考满分 · Question Bank

9,362 questions collected


2,476 search results

Source | Question content
Ready4

The release of accumulated stress at a particular point during an earthquake alters the shear and normal stresses along the tectonic plate boundary and surrounding fault lines. According to geophysical theory, the Coulomb Failure Stress Change, which is the estimated alteration and resultant transfer of shear and normal stresses along a plate margin, is a function primarily of the change in shear stress along a fault and secondarily of the change in normal stress along the fault and the change in pore pressure in the fault zone, the latter two factors scaling according to the friction coefficient characteristic of the plate margin.

By measuring and calculating the stress transfer following seismic activity, it is possible subsequently to construct basic contour maps of regions where there have been positive stress changes and that are therefore at higher risk of containing the epicenters of future large earthquakes. Calculations have revealed that when an earthquake occurs, approximately 80% of the energy is released as seismic waves, whereas the remaining 20% is stored and transferred to different locations along the fault, making those specific regions more susceptible to future earthquakes.

Predicting earthquakes by using the theory of stress transfer has important potential applications. The main rival technique for forecasting, the statistical analysis of patterns in seismic activity, suffers from a contradiction: foreshocks are deemed evidence of the potential for a future high-magnitude earthquake, but the lack of foreshocks along faults known to be active has been considered an equally plausible precursor for large events.

The stress transfer theory was used to predict the location of a magnitude 7.4 earthquake that occurred two years later in the port city of Izmit, Turkey, killing more than 30,000 people. A limitation of the theory as currently applied, due to insufficient understanding of plate kinematics, is that refining predictions with temporal constraints appears to be far more problematic; the team that gave the Izmit prediction had been able to forecast an event near the city only to within thirty years.
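
The Coulomb Failure Stress Change described verbally above is commonly written in the following simplified form (a standard formulation assumed here for clarity; the passage itself gives no equation):

\[ \Delta \mathrm{CFS} = \Delta\tau + \mu \left( \Delta\sigma_n + \Delta p \right), \]

where \(\Delta\tau\) is the change in shear stress on the receiving fault, \(\Delta\sigma_n\) the change in normal stress (positive for unclamping), \(\Delta p\) the change in pore pressure, and \(\mu\) the friction coefficient characteristic of the plate margin. Regions with \(\Delta \mathrm{CFS} > 0\) are the "positive stress change" areas mapped as higher risk in the contour maps the passage describes.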

Ready4

The controversial Canadian media intellectual Marshall McLuhan first began to garner public attention with his book The Mechanical Bride in 1951, precisely during the time when North America was first gripped by, and attempting to come to grips with, the influence of television programming and advertising on society. One of McLuhan's core theses was that every communication medium, including the television, has inherent effects apart from those that any artist or businessperson willfully creates through it, and that these effects are not always positive.

McLuhan achieved the height of public attention in part by emulating the advertisers he studied, inventing memorable phrases to convey his points (such as "the medium is the message," "turn on, tune in, drop out," and "global village"). Arguably, however, he never expected or even hoped to deflect substantially the tide of the technological and social forces in play at the time. He likened the successful reader of his works to the sailor in Edgar Allan Poe's story "A Descent into the Maelstrom," who saves himself by studying a whirlpool and by moving with, not against, its current.

The media thinker's legacy is in equal parts inevitable and inconsequential. The advent of the internet, which he had predicted thirty years prior, and of subsequent technologies would force society to broaden its perspective of media channels and examine their impact more closely. On the other hand, in the present milieu, where media professionals and advertisers tend to speak of "channels" and "content" as well-defined and non-overlapping components of communication, McLuhan's primary message appears to have been lost among all the new mediums.

Ready4

One strain of historical thought that achieved popularity in the 1950s forwarded the notion that immigration - more than the frontier experience, or any other specific event or factor - had been and continued to be the defining element of United States history. In this depiction, the 30 million immigrants who entered the country between 1820 and 1900 had common experiences regardless of their nationality, religion, or race: namely, in experiencing hardship and alienation, they themselves changed, but they also carried forward the development of the nation itself.

Both casual and formal students of history should, however, be careful in equating the experiences of different groups of immigrants, especially under the somewhat blurring concept of "hardship." The description that all immigrants experienced hardship and alienation fails to account properly for the fact that in the 17th and 18th centuries millions of Africans were forcibly shipped to the United States and sold into slavery. While this group of people should not be excluded from any full reckoning of the nation's migrants, its alienation and hardship were of a substantially different character from those of the other populations, who migrated more willingly and independently and who arrived under and lived in vastly different conditions. If it is, indeed, the degree of hardship and alienation experienced by the different groups of our nation's migrants that has above all shaped both the migrants themselves and their nation, then to ignore these distinctions would be to distort an important element of what our nation has been shaped to be.

Ready4

A myth in the ongoing debate about minimum wages is that raising minimum wages will necessarily increase a country's unemployment rate. While there are cases in which a marginal increase in wage rates might affect a company dramatically enough for it to change its operations, in most companies the cost increases of higher wages will tend to affect the bottom line without altering the staffing structure. For example, if a particular fast-food location operates during a particular time window with a staff of five people, then five must be the minimum staffing level for that business to achieve optimal results. In the case of a national fast-food chain, especially, these operational questions will in general already have been optimized. Even before rates are raised, managers of these locations have asked themselves whether they can afford to cut jobs and whether they are staffed at optimal levels (in this case, five people). A more specific calculation is needed. In this example, the precise question is how a marginal increase in staffing costs would compare to the decrease in business that would result from reducing the staff level from five to four and serving food less quickly. The results of this analysis would not necessarily be consistent across industries, or even across markets and companies within an industry.
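
The "more specific calculation" the passage calls for can be sketched in a few lines of Python; every number below is an illustrative assumption, not a figure from the passage.

    # Hypothetical comparison for the fast-food example in the passage.
    # All figures are invented for illustration only.
    wage_increase_per_hour = 1.50        # assumed rise in the hourly minimum wage
    hours_per_week = 30                  # assumed weekly hours per worker
    staff_level = 5

    # Marginal weekly increase in staffing costs if all five workers are kept
    added_weekly_cost = wage_increase_per_hour * hours_per_week * staff_level   # 225.0

    # Assumed weekly revenue lost if the location drops to four workers
    # and serves food less quickly
    lost_weekly_business = 600.0

    # The passage's comparison: keep the current staffing structure whenever the
    # marginal cost increase is smaller than the business that would be lost
    keep_five_workers = added_weekly_cost < lost_weekly_business
    print(keep_five_workers)             # True under these assumed figures

Under different assumed figures (a different industry, market, or company) the inequality could flip, which is the passage's closing point about results not being consistent across industries.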

Ready4

After suffering a countercyclical decline in the prosperous 1920s, the American labor movement grew in fits and explosive starts during the Great Depression and had finally come of age by 1940. Initially, the sole major organizational vehicle for unionized workers in the nation was the American Federation of Labor (AFL), which, carrying on the legacy of Samuel Gompers, was reluctant to take steps that might instigate government counteraction, and which therefore failed to exploit completely the growing unrest of American workers. Some AFL leaders, such as John L. Lewis of the Miners, had more aggressive views and wanted to push unionization into industries it had not yet substantially occupied, but these individuals were the exception in the organization, not the norm. A shift was precipitated by an act of legislation, the National Industrial Recovery Act of 1932, whose clause 7a guaranteed workers a choice of union and compelled employers to deal with those unions. The clause unleashed a wave of unionization, both spontaneous and driven by the AFL. Membership in the AFL surged, but the organization's conservatism--its orientation toward skilled labor, in particular--left it ill-equipped to organize and harness the energy of the mass of relatively unskilled workers clamoring to join the movement. In light of this roadblock (which actually caused the AFL to lose members), John L. Lewis formed the Committee for Industrial Organization (CIO) out of ten of the more aggressive AFL unions. Within a year of its inception, the member unions of the CIO were expelled from the AFL, and the CIO stood as a distinct organization, at that point newly unhampered in its aggressive and sometimes violent pursuit of membership from unskilled labor. The CIO's tactics were effective, so that, by the time the AFL and the CIO reconciled in 1940, total union membership had risen to 8.5 million, from barely 3 million in 1929.
Ready4

A prototypical nanoparticle is produced by chemical synthesis, then coated with polymers, drugs, fluorophores, peptides, proteins, or oligonucleotides, and eventually administered into cell cultures. Nanoparticles were conceived of as benign carriers, but multiple studies have demonstrated that their design influences cell uptake, gene expression, and toxicity.

More specifically, interactions between nanoparticle-bound ligands (the molecules that bind to nanoparticles) and cellular receptors depend on the engineered geometry and the ligand density of a nanomaterial. The nanoparticle acts as a scaffold whose design dictates the number of ligands that interact with the receptor target. A multivalent effect occurs when multiple ligands on the nanoparticle interact with multiple receptors on the cell. The binding strength of complexed ligands is greater than the sum of the individual affinities; the accumulated effect of multiple affinities is known as the avidity of the entire complex.

This phenomenon is illustrated by the binding affinity of the antibody trastuzumab to the ErbB2 receptor, a protein whose overexpression has been shown to play an important role in the development and progression of certain aggressive types of breast cancer. Trastuzumab's binding affinity to ErbB2 when liganded to a nanoparticle increases proportionally with the size of the nanoparticle, owing to a higher density of the ligand on the nanoparticle surface. However, when viewed in terms of the downstream signaling via the ErbB2 receptor, mid-sized gold nanoparticles induced the strongest effect, suggesting that factors beyond binding affinity must be considered.

Furthermore, several studies have shown that nanoparticle design can generate effects not obtained simply by a free ligand in solution. For example, the aforementioned mid-sized trastuzumab-coated gold nanoparticles altered cellular apoptosis, the process of programmed cell death, by influencing the activation of the so-called "executioner" caspase enzymes. Similarly, receptor-specific peptides showed an improved ability to induce angiogenesis, the physiological process through which new blood vessels form from preexisting vessels, when conjugated to a nanoparticle surface. Specifically, presentation of the peptide on a structured scaffold increased angiogenesis, which is dependent on receptor-mediated signaling. These findings highlight the advantages of a ligand bound to a nanoparticle over one free in solution. The nanoparticle surface creates a region of highly concentrated ligand, which increases avidity and, potentially, alters cell signaling.

Ready4

In 1905, the Supreme Court of the United States decided the case Lochner v. New York, and in doing so overturned the Bakeshop Act, which limited the number of hours that a baker could work per day to ten. The Court ruled that the Act removed a person's right to enter freely into contracts, which it construed as provided for by the Fourteenth Amendment. The Court had previously determined through multiple rulings that the Due Process Clause, found in both the Fifth and Fourteenth Amendments, was not merely a procedural guarantee, but also a substantive limitation on the type of control the government may exercise over individuals. Lochner set a precedent against federal and state laws regulating working hours and wages. For example, in Adkins v. Children's Hospital, in 1923, the Court ruled that federal minimum wage legislation for women was an unconstitutional infringement of liberty of contract, as protected by due process.

Some subsequent development of human rights evolved on the basis of Lochner; for example, Adkins was a significant point in the women's rights movement in the U.S., as the legislature and justice department debated for decades whether to establish absolute equality for women or to provide only special protections and regulations for them. Nevertheless, the Court overturned Adkins and undermined Lochner in deciding West Coast Hotel v. Parrish, in 1937. That ruling repudiated the idea that freedom of contract should be unrestricted and echoed, after the fact, the dissenting opinion of Justice Holmes in Adkins that there were plenty of constraints on contract, such as that against usury. At the time of West Coast Hotel, whose outcome hinged on an unexpected shift in the habits of Associate Justice Roberts, the dissenting Justice Sutherland was critical of the prospect that the interpretation of the Constitution reflected in the decision had been colored by contemporary events, ostensibly the pressures placed upon workers by the circumstances of the ongoing Great Depression. Time has evidently judged this criticism to have been incorrect, since, while Lochner influenced a ruling whose imprint still remains, individual freedom of contract is not exempt from reasonable laws to protect worker health and safety.

Ready4

The diversity of species in bacterial communities is often studied by phenotypic characterization. A problem with this method is that phenotypic methods can be used only on bacteria that can be isolated and cultured, and most soil bacteria that have been observed by fluorescence microscopy cannot be isolated and cultured.

DNA can be isolated from bacteria in soil to obtain genetic information about the nonculturable bacteria therein. The heterogeneity of this DNA is a measure of the total number of genetically different bacteria, or the number of species. DNA heterogeneity can be determined by thermal denaturation and reassociation. In general, renaturation of homologous single-stranded DNA follows second-order reaction kinetics. In other words, the time required for a given fraction of the DNA to renature is proportional to the genome size, or the complexity of the DNA, defined as the number of nucleotides in the DNA of a haploid cell, without repetitive DNA. The genetic diversity of a bacterial community can be inferred in a similar manner.

Vigdis Torsvik, Jostein Goksyr, and Frida Lise Daae used this process to analyze samples taken from the soil of a beech forest north of Bergen, Norway. The reassociation curves for the main DNA fraction did not follow ideal second-order reaction kinetics, so the half-life values gave only approximate, underestimated values for the number of genomes present. Nevertheless, the soil bacterium DNA was very heterogeneous; the diversity corresponded to about 4,000 distinct genomes of a size typical of standard soil bacteria. This diversity represents about 200 times as many species as could have been isolated and cultured.

Various procedures for isolating DNA from river sediments and seawater are known. This opens up the possibility of applying the thermal denaturation method to systems other than soil. The results of the Norway study indicated that the genetic diversity of the total bacterial community in a deciduous-forest soil is so high that its heterogeneity can be determined only approximately. In environments with pollution or extreme conditions, the genetic diversity might be easier to determine precisely.
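
The second-order reassociation kinetics the passage invokes are conventionally written as follows (a standard textbook form of Cot analysis, assumed here rather than quoted from the passage):

\[ \frac{C}{C_0} = \frac{1}{1 + k\,C_0 t}, \qquad C_0 t_{1/2} = \frac{1}{k} \propto N, \]

where \(C\) is the concentration of DNA still single-stranded at time \(t\), \(C_0\) the initial concentration, \(k\) the reassociation rate constant, and \(N\) the complexity (the number of non-repetitive nucleotides). Because more complex, more diverse DNA reassociates more slowly, observed half-life values can be converted into an estimate of how many distinct genomes a soil sample contains, which is the inference the Norwegian study relied on.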

Ready4

Two historians of the First World War both depict women as taking up roles previously reserved for men, but they differ slightly in the significance they ascribe to these unprecedented but temporary wartime duties. Gail Braybon describes the war as a liberating experience for many women. Although women working in munitions factories were subject to new dangers, such as explosions and trinitrotoluene poisoning, they were mindful of and proud of supporting the war effort, whether or not they considered the broader significance of their actions. Joshua Goldstein too describes a sense of freedom in women but emphasizes that it was short-lived. Although the war bent gender roles, it did not lessen hostility to women in traditionally male jobs, increase compensation for female labor, or uproot the notion that home life was a strictly female responsibility. Braybon might reply by noting that, while other changes were slower in coming, some women suffragists supported the war and women's role in it to further their cause, and this may indeed have contributed to the advent of women's right to vote after the war, even by Goldstein's account. Perhaps more central to Braybon's position is that the liberation women experienced during the war was one of sentiment and was therefore made no less real by the lack of accompanying widespread reform. Furthermore, even though the spirit of liberation must have faded with the end of the war, it might have lived on in a latent form and ultimately contributed to the formation of the women's movement.

Ready4

The study of climate change has established retreating glaciers and rising global temperatures from a number of data sources. Establishing the influence of mankind upon these effects has been more difficult, because the climate is subject to oscillations that are much longer in duration than our record of direct temperature measurements, which extends back only about 150 years.

By drilling and conducting chemical and physical studies of ice cores on six of the seven continents, scientists have developed a method of estimating climatic information that had previously been thought inaccessible. Ice cores removed from ice sheets and glaciers and studied in order to draw such inferences are termed paleo-proxies. The values of various climatic variables at a particular time and place can be inferred through some form of proxy analysis of a given ice core sample. For example, deuterium excess indicates humidity levels, electrical conductivity indicates volcanic activity, beryllium levels indicate solar activity, and particle size and concentration indicate wind speeds. Temperature, in particular, can be inferred from the ratios of water molecules composed of stable isotopes of oxygen and hydrogen, namely ¹H²H¹⁶O and ¹H₂¹⁸O. Because molecules consisting of these isotopes have slightly different weights from their more common counterparts, their concentration in the ice core in a given epoch depends on the condensation temperature prevailing at the time. This technique enables scientists to estimate the air temperature of condensation when the snow fell and to establish variations in temperature over a series of multiple samples.

One advantage of using ice cores as a paleo-proxy is that ice core samples can be extracted from across the world using different drilling techniques, for analysis either on-site or in a laboratory, with results that can be compared to each other and stitched into a coherent global picture. The primary sources of ice cores are the ice sheets of Antarctica and Greenland, whose thickness allows scientists to extract long cores representing time spans of up to 100,000 or even 400,000 years. Nevertheless, samples representing spans of multiple centuries have been extracted more recently at low latitudes--for example, at Mt. Kilimanjaro, in the Andes Mountains, and on the Himalayan plateau. Depending on the objectives of the project and the nature of the ice core, scientists use a variety of drills, ranging from hand-powered augers to electro-mechanical drills. A limitation of using ice cores is that they represent data for conditions during snowfall only. Periods bereft of snowfall will fail to leave a record in the ice and can even disrupt the essential step of dating the samples. To mitigate this problem, multiple cores are typically extracted from nearby locations. A more critical limitation of the ice core method, one indicative of the larger problem at hand, is that as ice fields continue to retreat, the ability to measure in some locations will disappear entirely.
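
The temperature and humidity inferences the passage describes are usually expressed in standard delta notation (a convention assumed here; the passage gives no formulas):

\[ \delta^{18}\mathrm{O} = \left( \frac{\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\text{sample}}}{\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\text{standard}}} - 1 \right) \times 1000\text{‰}, \qquad d = \delta\mathrm{D} - 8\,\delta^{18}\mathrm{O}, \]

where \(\delta^{18}\mathrm{O}\) (and, analogously, \(\delta\mathrm{D}\) for deuterium) measures the depletion of the heavier water molecules relative to a reference standard and varies with the condensation temperature, while the deuterium excess \(d\) is the quantity the passage links to humidity conditions.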

Ready4

     It has been estimated that over 20% of the annual gross domestic product in the United States is the result of innovation backed at some point by venture capital investors. But what is innovation? One traditional view of innovation is that it is a systematic business process occurring within an organization required to secure ongoing financial growth. But much of the most acclaimed and influential innovation has started with an individual's idea and only somewhat later followed with an organization to execute on that idea, so the organizational definition is of limited relevance.      A more practical definition of innovation is that it is the creation of anything new intended to be commercialized. Under such a definition, the efforts of a lone individual developing a radical idea and those of a department within a large company to explore a new adjacent market are both examples of innovation. This somewhat loose definition, however, fails to address explicitly what makes an innovation truly new, successful, or authentic, although it may imply that all innovation is equally valid in a sense. Otherwise, the oft-repeated challenge to uses of the term innovation may put too little emphasis on the activity and too much on its results. Quite possibly, 80% of the value of innovation has been contributed by 20% of the activity, but whether that 20% of activity could have manifested itself without a culture and economy to support the whole is less clear. In this regard, policy- and strategy-oriented attempts to refine this loose definition of innovation further are without merit.    

Ready4

Despite views that globalization has reached its peak only recently, the period beginning in the nineteenth century and extending into the early twentieth century is, in fact, the interval during which international barriers to trade fell most steeply, as can be seen in the case of price convergence in commodities. The prices of cloves, pepper, and coffee had failed to converge between Amsterdam and East Asia or between England and India from as far back as 1580, but began in 1820 to draw closer. Similarly, the difference in wheat prices between the United States and England fell from one hundred percent in the early nineteenth century to negligible levels late in the century and to no difference at all in the early twentieth century. A similar story unfolded during this period for bacon, cotton, and rice.

Peter Lindert and Jeffrey Williamson have summarized the price gaps in commodity markets between continents as evolving in three phases. First, from 1820 to 1914, these gaps fell by 81 percent; they attribute 72 percent of this decline to cheaper transport and 28 percent to trade policies. Second, during the wartime period of 1914 to 1950, the gaps doubled, owing to a reversal in trade policies. Finally, from 1950 to 2000, they fell again, by 76 percent, ending up 92 percent lower than in 1820, with about four-fifths of the total change attributable to cheaper transport and one-fifth to more favorable trade policies.
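The three phases are arithmetically consistent. Indexing the 1820 intercontinental price gap at 100 (an illustrative normalization, not a figure from the passage):

\[ 100 \xrightarrow{\,-81\%\,} 19 \xrightarrow{\,\times 2\,} 38 \xrightarrow{\,-76\%\,} 38 \times 0.24 \approx 9, \]

which leaves the gap roughly 91 percent below its 1820 level, in line with the 92 percent figure Lindert and Williamson report for 2000.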

Trade data for this period are typically articulated in terms of the ratio of total trade volume to gross domestic product, since commodity price information is not universally available. These ratios in many advanced economies were higher in the mid-1990s than in the early 1900s, but not by much. In Japan, notably, the percentage of GDP for which trade accounted in 1995 was 17 percent, far under its 1910 level of 30 percent, as measured in current prices. To be sure, the ratios have risen somewhat in other economies over that same time period: by 13 percentage points in the United Kingdom, 8 points in France, and from 11 percent to 24 percent in the United States; this latter spike may explain why attention to globalization has been especially acute in America. These increases, nevertheless, are modest given the fact that the world economy grew roughly twice as quickly in the twentieth century as in the nineteenth.

     The ratios above, and hence the substantiation for the argument that globalization peaked a century ago, grow much more dramatically if they are computed in constant prices rather than in current prices, because the prices of goods relative to services fell due to sustained increases in productivity in the sectors producing these goods. Trade has grown most in those sectors in which prices have most strikingly fallen, so the proportions of GDP in constant prices have risen more than those in current prices.

Ready4

With globalization, "sustainable development" is concerned not only with economic development but also with the resolution of social and environmental problems. Despite different interpretations, the number of companies in the world that are realizing the economic benefits of CSR (Corporate Social Responsibility) policies has been on the rise in the past decade. CSR commitments go beyond the desire for higher profit and show that businesses are fully aware of their responsibilities to employees, customers, community and the environment. Many companies use CSR as a new business approach when they realize that it can help enhance the role of managers, improve financial situations, strengthen the motivation of staff, inspire customer loyalty, and bolster corporate reputation in society.

The challenges facing world companies in the application of CSR are clear. The CSR awareness of a company may be considered an index proportional to that company's success on the path of development. Multinational corporations and other powerful companies apply a set of rules outlining responsibilities or proper practices (called a code of conduct or code of ethics) and standards like SA8000, WRAP, ISO 14000 and GRI, and regard them as their commitments to the world. Industry titans are paying heavily to pursue an ideal business model with a highly competitive system, sustainable development and greater social responsibility. For example, Best Buy - the international retailer of consumer electronics and entertainment software - is famous for its product recycling program. Starbucks has lent a hand in many community activities. The world's largest Internet search provider, Google, treats its employees as "gold." In addition to guaranteeing the quality of life of employees, protecting the environment, and developing products that benefit both consumers and the environment, companies also set up funds and donate to charity to contribute to the development of society and the community. Oil group Royal Dutch Shell established charity foundations, including the Early Learning Centre in South Africa, to educate children and teach skills to adults. The World Bank (WB) and pharmaceutical company Merck launched an initiative to develop a $50 million foundation that includes donating Mectizan products to help 28 African countries combat disease. Billionaires like Bill Gates and Warren Buffett and their companies' foundations contribute significantly to the eradication of disease through their donations.

In developed countries, corporate social responsibility spending on research and development exceeds spending for charitable purposes, although the philanthropic sums are not insignificant. In conclusion, corporate social responsibility is playing an increasingly important role in fostering sustainable practices and economic development, as well as the observance of laws and ethics, throughout the world.

Ready4

Largely unrecognized during his lifetime, Blake is now considered a seminal figure in the history of both the poetry and the visual arts of the Romantic Age. All around him, Blake saw tendencies to restrict human capacity and the freedom of the imagination, all of which strengthened his commitment to man's basic right to social, political, religious and sexual equality. Along with the Romantic poets, Blake sought out a spiritual truth, a truth that could only be achieved by the use of feelings and the imagination. He believed that man originated from a spiritual realm and was born as a free spirit, but that as a result of the deeply negative influence of the earth itself, man became trapped in the confines of his physical body and the five senses, which limited his capacity for perception. The only way to be freed from this confinement was by what he called "Imagination," the capacity to apprehend realities beyond the prison of the physical world.

When considering the totality of Blake's work, it has been suggested that he worked on two levels: a commercial or worldly level, and a spiritual level. On the one hand, he worked as a commercial engraver, undertaking engraving and printing jobs for his customers. On the other hand, he produced his personal, spiritual work, and it was this part of Blake's work that inspired his development of illuminated printing and prophetic books, written "so that the spirits could see them." The poems in Blake's Prophetic Books were exciting texts of a mystical nature, and very few people, not even Blake's closest friends, could understand them. Today they are considered to be masterpieces, expressing Blake's belief in a spiritual world and his hope that man can overcome all limitations by means of the spirit within himself.

Blake was above all a great religious thinker who used both art and words to express mental and spiritual truths that had been forgotten by an increasingly materialistic society. Blake's great achievement was to uphold the image of the spiritual man in a world dominated by material forces. This he was able to do because he felt himself part of a vast spiritual world, a world that had been revealed to him in his visions. Some regarded him as not entirely in his right mind, but others were convinced that what he saw was true. It is a question of what kind of world you believe in. And to Blake, reality was spirituality.

Ready4

One of the most frequently asked questions has always been: How was the universe created? Many once believed that the universe had no beginning or end and was truly infinite. With the advent of the Big Bang theory, however, the universe could no longer be confidently considered infinite. The universe was forced to take on the properties of a finite phenomenon, possessing a history and a beginning. However, over the decades, there have been multiple interpretations of the Big Bang. In the standard interpretation, which took shape in the 1960s, the formative event was not an explosion that occurred at some point in space and time - it was an explosion of space and time. In this view, time did not exist beforehand. Even for many researchers in the field, this was a bitter pill to swallow. It is hard to imagine time just starting: How does a universe decide when it is time to pop into existence?

In 2004, Sean Carroll and a graduate student of his, Jennifer Chen, came up with a much different answer to the problem of "before." In his view, time's arrow, or time's flow in only one direction, and time's beginning cannot be treated separately and are in fact cyclical: there is no way to address what came before the Big Bang until we understand why the before precedes the after. "Our universe has been evolving for 13 billion years," Carroll says, "so it clearly did not start in equilibrium." Rather, all the matter, energy, space, and even time in the universe must have started in a state of extraordinarily low entropy. That is the only way we could begin with a Big Bang and end up with the wonderfully diverse cosmos of today. Understand how that happened, Carroll argues, and you will understand the bigger process that brought our universe into being. In Carroll and Chen's theory, fluctuations in the dark-energy background trigger a crop of pocket universes from empty space, which eventually go back to being empty space.

Yet another theory is put forth by rebel physicist Julian Barbour. According to Barbour, in his 1999 book "The End of Time," all possible configurations of the universe, every possible location of every atom, exist simultaneously. There is no past moment that flows into a future moment; the question of what came before the Big Bang never arises because Barbour's cosmology has no time. The Big Bang is not an event in the distant past; it is just one special place in a vast universe.

Ready4

By far the most common cause of hair loss in men is androgenetic alopecia, also referred to as "male pattern" or "common" baldness. It is caused by the effects of the male hormone dihydrotestosterone (DHT) on genetically susceptible scalp hair follicles. This sensitivity to DHT is present mainly in hair follicles that reside in the front, top, and crown of the scalp (rather than the back and sides), producing a characteristic and easily identifiable pattern.

It is frequently stated that "hair loss comes from the mother's side of the family." The truth is that baldness can be inherited from either parent. However, recent research suggests that the reasons for hair loss and balding may be a bit more complex than originally thought. Factors on the X-chromosome have been shown to influence hair loss, making the inheritance from the maternal side of the family slightly more important than the paternal one. The identification of an androgen receptor gene (AR) on the X-chromosome helps to explain why the hair loss pattern of a man resembles his maternal grandfather more often than his father. However, this is clearly not the whole story since a direct inheritance of baldness from the father is observed as well. An autosomal (non-sex) linked gene would explain this type of transmission - but this gene has not yet been found.

DHT is formed by the action of the enzyme 5-alpha reductase on testosterone, the hormone that causes sex characteristics in men. DHT causes male hair loss by shortening the growth, or the anagen, phase of the hair cycle, causing miniaturization of the follicles, and producing progressively shorter, finer hairs. Eventually these hairs totally disappear.

Ready4

Archeology has undoubtedly enriched mankind's understanding of history more than any other discipline, unraveling a great part of man's unwritten past. Studying the material remains of past human life and activities may not seem important or exciting to the average person when compared to other scientific pursuits like the biological sciences. But archeology's aim to understand mankind is a noble endeavor that goes well beyond uncovering buried treasures. Knowing what made past cultures cease to exist may indeed provide the key to making sure that history does not repeat itself.

However, when it comes to dating archeological samples, several timescale problems arise. For example, Christian time counts the birth of Christ as the beginning, AD 1 (Anno Domini); everything that occurred before Christ is counted backwards from AD as BC (Before Christ). The Greeks consider the first Olympic Games as the beginning, or 776 BC. The Muslims count the Prophet's departure from Mecca, or the Hegira, as their beginning, at AD 622. The Mayan calendar used 3114 BC as its reference. As a result, carbon dating began to emerge as the most popular and accepted form of determining the absolute dating of artifacts and events. However, this technique is not without its limitations or risks.

First, the size of the archeological sample is important. Larger samples are better because purification and distillation remove some matter. Although new techniques for working with very small samples, such as accelerator dating, have been developed, these are very expensive and still somewhat experimental. Second, because the decay rate is logarithmic, radiocarbon dating has significant upper and lower limits. It is not very accurate for fairly recent deposits: in recent deposits, so little decay has occurred that the error factor (the standard deviation) may be larger than the date obtained.
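
These limits follow from the radioactive decay law itself (a standard relation assumed here; the passage states no equations):

\[ N(t) = N_0 e^{-\lambda t}, \qquad t = \frac{1}{\lambda}\ln\frac{N_0}{N(t)}, \qquad t_{1/2} = \frac{\ln 2}{\lambda} \approx 5{,}730\ \text{years for } ^{14}\mathrm{C}, \]

so for very recent deposits the measured ratio \(N(t)/N_0\) is so close to 1 that the statistical uncertainty can exceed the computed age, while for samples many half-lives old too little \(^{14}\mathrm{C}\) remains to be measured reliably.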

Despite all of this, carbon dating remains a useful tool for archeological research, helping archeologists obtain a much needed global perspective on the timing of major prehistoric events.

Ready4

The origin of life remains one of the world's greatest mysteries even to this day. Several theories have attempted to explain this phenomenon, but thus far, there has been no conclusive evidence to support one theory over another. One of the most popular theories is the primordial soup theory. This model purports that all life on the planet was created out of a "primordial soup." Oparin and Haldane, the scientists who developed this model, hypothesized that all life forms originated in a warm pond or ocean from a combination of chemicals that created amino acids. All the building blocks of life were already available in the atmosphere. These molecules were then "energized" by lightning, and combined to form amino acids which, in turn, folded into complex proteins. The first organisms to evolve were simple heterotrophs, which survived by consuming other organisms. Theoretically, this is plausible. Miller and Urey, in 1950, carried out a rather famous experiment to test this theory. They mixed gases believed to have been present in the primordial atmosphere and then energized the mixture with an electric spark. They discovered that all 20 amino acids could be produced in this manner.

However, there are several major flaws in this theory. First, the formation of proteins from amino acids is an extremely complex process. There are over a trillion ways amino acids can combine to form proteins, and hitting upon the right protein by accident is extremely improbable. Moreover, the primordial soup would have been far too dilute to achieve anything. In short, there is just too much left to chance. As one critic put it, "What are the chances that a monkey playing with an old typewriter will accidentally type out a bestseller?" An alternative theory is that life originated deep underground, not in a body of water. One could argue that "superbugs," which are present in deep soil even today, were the first life forms to evolve. They may have originated as microbes thriving on intense heat. However, there has been no satisfactory account of how these superbugs evolved from a mass of chemical compounds.
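
The "over a trillion ways" figure is easily checked with simple combinatorics (an illustrative calculation, not taken from the passage): with 20 amino acids to choose from at each position, a chain of length \(n\) has \(20^{n}\) possible sequences, and already

\[ 20^{10} \approx 1.02 \times 10^{13}, \]

more than ten trillion distinct ten-residue peptides; since functional proteins typically run to hundreds of residues, hitting a particular sequence by chance is, as the passage says, extremely improbable.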

Another hypothesis is that life did not originate on the earth at all. The Panspermia theory suggests that the seeds of life came from outer space. This theory has found considerable support, as it explains the sudden spurt in life forms after millions of years of nothingness. Also, statistics show that 7.5% of rocks from Mars reach earth. Such rocks, from Mars or elsewhere, could have transported bacteria, which can easily survive the harsh environment of space. After all, there is evidence that meteorites contain amino acids and bacteria. Panspermia states that these meteorites carried bacteria, amino acids, methane and carbon. These substances were then deposited on earth, either in warm oceans or underground, and eventually grew into simple life forms. This theory merits further research because it has the potential not only to explain the origin of life on earth but also to establish the presence of life beyond it.

Ready4

In 1848, Marx, supported by Engels, completed the "Communist Manifesto." The "Communist Manifesto" attempts to explain the goals of Communism as well as the theory underlying the movement. It argues that class struggles, or the exploitation of one class by another, are the motivating force behind all historical developments. Class relationships are defined by an era's means of production. However, these relationships eventually cease to be compatible with the developing forces of production. At this point, a revolution occurs and a new class emerges as the ruling one. This process represents the "march of history" as driven by larger economic forces. The "Communist Manifesto" stated that all men were born free but that society put the majority of men in chains. Engels referred to the book as being the "very way of life." In a higher phase of the ideal communist society, the enslaving subordination of the individual to the division of labor, and with it the antithesis between mental and physical labor, vanishes. Labor becomes not only a means of life but life's prime want, and the productive forces increase with the all-round development of the individual. All the springs of cooperative wealth flow more abundantly. "From each according to his ability, to each according to his needs!" echoes Marxism. The nonstarters were considered to be the ones who wanted a free enterprise system advocating maximum productivity. Economic progress in Marxist regions has been sterile because such regimes can imagine no change except through the extension of Marx's ideology - the argument that the elimination of social classes cannot come about through reforms or changes in government, but rather that a revolution is required. Indeed, some forms of neo-Marxism resemble little more than a giant "conspiracy theory," whereby a capitalist ruling class is able to manipulate other classes in society for its own ends and benefits.

Marxist theory greatly undervalues non-economic forces and wraps a great deal of belief in an economic shell at the expense of non-economic issues. In fact, modern industrial society in particular is characterized by class conflict between the bourgeoisie and the proletariat. However, the claim that the productive forces of capitalism are quickly ceasing to be compatible with this exploitative relationship is greatly overemphasized. More importantly, much Marxism - both old and modern - has tended to ignore the role and position of women in society. Women tend to be marginalized in much Marxist theorizing, possibly because of the focus upon work relationships. This criticism is irrelevant and obsolete in the current context. Capitalism, however, as an economic and political system, has proven to be more durable and flexible than Marx imagined.

Ready4

Over the past few years there has been a lot of discussion about the animation bubble in Canada, and the booms and busts the industry seems to go through every 7 years or so. This last boom, it could be argued, lasted well over a decade. However, such growth is not without substantial risks. The industry has long argued that one reason why the local animation industry has lagged behind its counterparts across the globe is its inability to compete against low-cost animation from Asia. The government, in a bid to address this disparity, has enacted a law requiring broadcasters in Canada to acquire 50 percent of their animation content from local animation studios.

The response appears to have been substantial. According to figures, the total animation content procured from local studios rose from $10 million in 2001 to $180 million in 2011, with no letup anticipated in the next decade. The software tools available today, mostly Flash, allow smaller companies to produce animation at a rate that was not conceivable even 15 years ago. Because of this major shift, entire productions, TV series, and features are being produced almost entirely in house using a crew that is almost entirely Canadian - a smart thing to do because it allows the broadcasters to take advantage of the lucrative Canadian and provincial tax credits available for just this kind of production. As promising as this is for local studios, this increased patronage poses dangers for them, too.

First, local studios risk expanding too fast and overextending themselves financially, since most are small concerns, unlike their Asian counterparts. They often need to make substantial capital investment in building technology capability in order to handle the larger volume of work. If, thereafter, the content fails to resonate with the audience and further episodes are cancelled, such firms can face potentially crippling losses.

A second risk is that these studios will end up creating work that is too specific to local audiences. Given the increased dependency on local broadcasters, the studios run the risk of creating content tailor-made for local audiences rather than remaining culture-neutral. This substantially reduces the ability of their products to sell beyond local boundaries.

Third, a local studio that secures a contract from one large broadcaster often runs the danger of becoming - and remaining - dependent on that single broadcaster. Even in the best of circumstances, fierce competition from Asian studios makes it difficult for small studios to broaden their customer base internationally: when such firms have nearly guaranteed orders from a single local benefactor, complacency born of their current success may set in.
