‘Don’t See Yourself As Evil’ – Sruly Green Song

[OFFICIAL SINGLE] Sruly Green – Bizt A Gita Mentch – די ביזט א גיטע מענטש

Published on Oct 29, 2018

“Du Bizt A Gita Mentch!” These words were repeated over and over to singer and composer Sruly Green. One evening, while working on a different song he was planning to release, Sruly began by chance to sing these words, and a new song started to take shape. That night the new track was created. The song was so good that Sruly decided to postpone the original tune he had been working on and fast-track this new song for immediate release. The song begins with a vort from לקוטי מוהר״ן (Likutei Moharan) which he was Zoche to hear from Reb Mota Frank when he shared the song with him.

Composed by Sruly, the song was then sent to Ian Freitor, no stranger to the music scene, who created this incredible arrangement with Daniel Kapler, a super duo responsible for many recent hits in the Jewish music world. Sruly turned to Gershy Schwartz for vocals, as well as choir members Yitzy Oestreicher, Yossi Greenberg, and Zishy Green. The song was mixed and mastered by Ian Freitor and Daniel Kapler.

Song Lyrics:

כמו כן הוא אצל האדם בעצמו,
שצריך לדון, את עצמו לכף זכות
ולמצוא בעצמו, איזה נקודה טובה
ולמצוא בעצמו, עדיין
(So it is with a person himself: he must judge himself favorably and still find in himself some good point.)

כדי לחזק את עצמו
שלא שלא יפול
לגמרי חס ושלום
(In order to strengthen himself, so that he does not fall completely, Heaven forbid.)

פריי זיך מיט ווער די ביזט
פריי זיך מיט וואס די טוסט
פארדעק דעם מיסט
אין זעה נאר דיין גוטס
(Rejoice in who you are, rejoice in what you do; cover over the garbage and see only your good.)

״די ביזט א גיטער מענטש״
״די ביזט א גיטער מענטש״
זאג צי זיך אליין
״איך בין א גיטער מענטש״
("You are a good person, you are a good person." Say to yourself: "I am a good person.")

כמו כן הוא…
(So it is…)

כדי לחזק את עצמו
שלא שלא יפול
לגמרי חס ושלום
(In order to strengthen himself, so that he does not fall completely, Heaven forbid.)

ווארף שוין אוועק דעם גילט
נעם נישט אלעס אלס דיין שילד
לאז עס נישט שניידען אין דיר
(Throw away the guilt already; don't take everything as your fault; don't let it cut into you.)

די ביזט א גיטע מענטש…
(You are a good person…)

יא איך בין א גיטער מענטש
איך בין א גיטער מענטש
איך זאג צי מיר אליין
איך בין א גיטער מענטש
(Yes, I am a good person, I am a good person; I say to myself: I am a good person.)

Continue reading…

From YouTube, here.

COVERUP: How the Feds Murdered 50,000 Oldsters via Lethal Flu Shots In 1993

By Bill Sardi
August 17, 2009
NewsWithViews.com

For decades now, since the 1918 Spanish flu epidemic, US life expectancy has progressively risen. But federal government documents reveal a sudden unexplained increase in the US death rate in 1993, so severe as to cause a decline in US life expectancy for the first time in nearly eight decades. Examine the chart below (Deaths: Preliminary Data for 2004, National Center for Health Statistics).

Nearly 93,000 more deaths were reported in 1993 than the previous year. My memory bank didn’t recall any outstanding disease or epidemic back then.

What was the cause of this severe increase in the death rate? I began to investigate.

Not caused by a non-infectious disease

The Monthly Vital Statistics Report said death rates for HIV infection (9.8%), chronic obstructive pulmonary disease (COPD, 8.2%), and pneumonia/influenza (8.1%) rose steeply from 1992 to 1993. However, the ten leading causes of death didn't change over that time period. The Centers for Disease Control (CDC) said deaths due to heart disease, COPD, HIV infection, pneumonia/influenza, and diabetes made the largest contributions to the overall mortality increase. In other words, the CDC spread the cause of the rising death rate across various diseases, far too broadly to point to any single cause. Not a word was said about this startling setback in life expectancy.

But that same government document said some of these increases in chronic disease (diabetes, heart disease, COPD) were “the result of the two influenza epidemics of 1993.” [Page 9, Monthly Vital Statistics Report, Volume 44, No 7(S), Feb. 29, 1996]

1993: Two flu epidemics

What two flu epidemics is the report referring to?

A CDC review of mortality patterns in 1993 also states that “the decline in life expectancy likely reflects increases in death rates for chronic disease during the two influenza outbreaks of 1993.” [Morbidity and Mortality Weekly Report 45(08): 1161-64, March 1, 1996] There it is again: confirmation that two flu epidemics in the same year caused an increase in deaths, with an admission that this resulted in a decline in the life expectancy of Americans.

Timeline of historical flu outbreaks

Americans may be roughly familiar with the historical timeline of flu outbreaks provided in the chart below. The chart has been adapted to show the severity of each influenza outbreak, as well as the SARS coronavirus outbreak of 2003. I have added the 1993 flu outbreaks to the chart.

Note that the 1993 flu outbreak, with nearly 93,000 more deaths than the prior year, caused more deaths than the well-known Asian and Hong Kong flu pandemics and would be second only to the Spanish flu pandemic of 1918 in comparable deaths. The Spanish flu temporarily set back US life expectancy from 50.9 years to 39.1 years. Of course, that was the pre-antibiotic era; there were no anti-bacterial or anti-viral drugs then.

According to charts provided by the CDC and other health organizations, it’s as if there was no flu epidemic in the US in 1993. I had to dig deep into the health reports of that year to find further confirmation that it was the flu, and no other disease, that caused the American life expectancy to steeply decline for one year.

Data showed only 3,430 more deaths among HIV-infected residents than the prior year. [Morbidity and Mortality Weekly Report 45: 121-25, 1996] Another study showed only 254 excess flu deaths among persons with HIV for 1992-93 and only 191 the following year. [Archives of Internal Medicine 161: 441-46, 2001] So HIV-infected persons, though at higher risk of death from the flu, cannot explain the unusual number of deaths attributed to influenza in 1993.

It’s also possible that flu vaccination rates declined in that year, but a quick search on Google found evidence to the contrary. Vaccination rates were rising while the flu outbreak of 1993 proceeded. (See chart below)

It struck elderly nursing home residents. But why?

So I began to re-read a government document I had flagged with a red paperclip during my investigation. A flu surveillance report published by the CDC states that the “1992-93 influenza season was dominated by influenza B, but increasing circulation of influenza A (H3N2) viruses toward the end of the season,” and it was this late-season influenza A activity that struck nursing home populations with deadly consequences.
For reference, type A flu viruses are the most virulent and the most common; type B viruses are less common and strike humans almost exclusively.

The report went on to say that influenza B viruses predominated early in the season and were mainly limited to school-age children, and “no excess mortality was observed.” Then sustained excess mortality began in mid-March of 1993 and coincided with outbreaks in nursing homes. [Morbidity and Mortality Weekly Report 46(SS-1): 1-12, Jan 31, 1997]

Like the more recent swine flu outbreak which began in Mexico, the second flu bout in 1993 began late in the season.

For comparison, the Mexican swine flu outbreak began in March or April of 2009, whereas the second 1993 flu outbreak began in March and peaked even later, in August and September. The pathogenic virus involved in 1993 was identified as the Type A H3N2 A/Beijing/32/92 strain. [Morbidity and Mortality Weekly Report 43(10): 179-183, March 18, 1994]

Still, why would the government hide such an epidemic, particularly the second one in 1993? I had no clue.

Free flu shots begin in 1993

I had uncovered much of this information over two years ago. But the reason for the cover-up remained elusive until I read a Health & Human Services press release issued in 1999. It said that Medicare coverage for flu shots for the elderly began in 1993, as the Administration launched an effort to increase immunization rates among older adults. The shots were free for those enrolled in Medicare Part B. The release can be found here.

The big difference from prior years was that elderly Americans were getting free flu shots.

According to The Vaccine Guide (North Atlantic Books, 2002), during the 1992-1993 season, 84 percent of samples of the predominant type A virus circulating in the US population were not similar to the virus in the vaccine. The flu vaccine that year would have been largely worthless. But that wouldn't explain such a huge increase in deaths, particularly in nursing home populations that apparently hadn't received flu shots in prior years for lack of funding.

There was a very slight increase in the risk for Guillain-Barré syndrome in the period 1992 to 1994 from flu shots (one additional case per million persons vaccinated). [New England Journal of Medicine 339: 1797-802, 1998] This would still not be sufficient to produce a setback in life expectancy.
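
To see why, here is a rough back-of-envelope sketch. It is illustrative only: the 30 million vaccinee count and the 5 percent Guillain-Barré case-fatality rate are assumptions made for the sake of argument, not figures from the article or the study it cites.

    # Back-of-envelope check (illustrative; assumed inputs flagged below):
    # could the extra Guillain-Barre risk from flu shots account for the
    # 1993 death spike?

    excess_gbs_per_million = 1        # NEJM figure cited above
    assumed_vaccinees = 30_000_000    # assumption: every Medicare-age adult vaccinated
    assumed_case_fatality = 0.05      # assumption: 5% of GBS cases prove fatal

    extra_gbs_cases = excess_gbs_per_million * assumed_vaccinees / 1_000_000
    extra_gbs_deaths = extra_gbs_cases * assumed_case_fatality

    print(f"Extra GBS cases:  {extra_gbs_cases:.0f}")    # 30
    print(f"Extra GBS deaths: {extra_gbs_deaths:.1f}")   # 1.5
    # versus roughly 93,000 excess US deaths recorded in 1993

Even under these generous assumptions, the vaccine-linked Guillain-Barré toll amounts to a few dozen cases and a handful of deaths, nowhere near the scale of the 1993 mortality spike.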

A death vaccine?

Now the big question comes to mind. Was the flu vaccine in 1993 lethal in some way? This could be the only explanation as to why this deadly flu outbreak has been hidden from the public. If so, it would be a severe blow to the nation’s flu vaccination program.

There is a hint of evidence in Europe that either a deadly flu virus or a “death vaccine” was in circulation that year. The Dutch National Influenza Centrum reported that nursing home residents in 1993 experienced a severe outbreak of the flu that struck 49% of them and caused 10% to die. That's a death rate roughly four times greater than that of the 1918 Spanish flu pandemic, whose case-fatality rate is commonly estimated at about 2.5 percent. The cause of the deaths was attributed to the Type A H3N2 flu viral strain. [Ned Tijdschr Geneeskd 137(39): 1973-7, Sep 25, 1993]

Could there have been some deadly vaccine in use in the US in 1993? So-called “hot” lots of vaccines are not a matter of public record. Flu vaccines inject a “little bit of disease” to provoke the production of antibodies and produce long-term resistance to a particular strain of the flu. Nursing home patients are often frail and immune-compromised. Every flu vaccine is a new formulation, produced in advance of the next flu season and usually composed of a new combination of three viral strains that virologists believe will be in circulation during the upcoming season. The three viral strains in these trivalent vaccines could have been deadly to frail elderly patients.

It is often stated that flu vaccines are comprised of “dead” or “attenuated” viruses. Strictly speaking, viruses are not alive; they are proteins and genetic material that require a host cell for replication. Virulent flu viruses are “grown” in fertilized chicken eggs until less virulent strains are produced, which are then used in vaccines.

In the process of making a vaccine in this manner, a hidden virus may be introduced, as happened with Simian Virus 40 (SV40), which was inadvertently introduced into polio vaccines decades ago. Newer methods of making vaccines would eliminate this problem. But was a deadly combination of viruses hidden in the flu vaccine used in 1993? Certainly, no flu vaccine manufacturer would admit to that.

The FDA has been lax in its monitoring of flu vaccine manufacturing facilities. For example, in December of 2000 The Idaho Observer noted that Medeva, a British flu vaccine maker, had received a warning letter from the US FDA over filthy conditions in its manufacturing plant, yet the FDA had given the green light to sell 20 million doses of its “Fluvirin” flu vaccine in the US that same year without re-inspecting the plant.

2009 – Bill Sardi – All Rights Reserved

From News With Views, here.

America the Arrogant

A Brief History of the United States

In 1492 Columbus rediscovered America, and the settlers, destructively exploiting its vast resources, achieved a success which they attributed to their own near-miraculous virtues, some of which they actually had: courage, rude vigor, industry, and an independent spirit. Shortly after, they emerged from WWII unscathed due to the military genius embodied in two oceans while competitors – Europe, Russia, China, and Japan – lay prostrate. America’s intact military and an economy up and running allowed the establishment of a fairly benign empire and an astonishing commercial dominance, both being attributed to near-miraculous virtues and regarded as permanent.

They didn’t see it coming.

Japan revived and began producing something it called a Toyota while Detroit, sure of its market, manufactured lousy cars that arrived falling apart, final assembly by owner. Germany revived. Communism still protected America from China, and no one foresaw that this would change. Airbus Industrie appeared, but no one believed it could compete with American know-how and engineering. It did. One by one, American manufacturers of airliners took shelter in the military market until only Boeing was left, more or less equal to Airbus. But Americans knew that Europe was socialist and had no work ethic.

Before long Japan had completely devoured the market for consumer electronics, cameras, and suchlike. Shipbuilding went, except for builders catering to the captive military market. The steel industry left for foreign shores. Few noticed. Americans knew that their prosperity sprang from their near-miraculous virtues, which foreigners could never achieve.

Eventually, China gave up on communism and became 1.3 billion smart, hardworking people who saw nothing wrong with the idea of becoming the world’s dominant power. Brazil began making airliners and American airlines began buying them. Even India showed signs of life. Americans didn’t worry because they knew that these funny countries couldn’t compete with America’s democratic values.

Manufacturing jobs began flowing to Asia, first a trickle and then a torrent. Americans didn't pay attention, not knowing exactly where Asia was. Anyway, those foreigners were comic little people with squinty eyes who ate with sticks. Who could take them seriously? Then design work and programming began emigrating eastward. America had invented the Internet and now would pay the price. Intellectual capital had broken free from physical capital. Oops.

American industry largely ceased to exist, or at least ceased to be American. The big companies became free-floating international entities, adventitiously putting down roots wherever taxes were low and labor cheap, which wasn't America. An HP laptop now consisted of an Intel CPU perhaps made in Ireland; a motherboard, hard drive, power supply, and case made in Taiwan; RAM and a screen from Samsung; the whole assembled in Taiwan or China. But the label said HP, so it was American.

The trade balance went sour, and then very sour. The country had long since become captive to consumerism, both national and individual, “He who dies with the most toys wins” being a bumper-sticker anthem. At every level America began living on credit, but America's credit was good, which Americans attributed to near-miraculous virtues that they no longer had, if they had ever had them.

As the economy invisibly declined, the military’s budget grew and grew. The country could no longer afford it, but the Pentagon was so deeply embedded in the economy and Congress that the country couldn’t stop affording it. The five-sided money hole spent on, an aging kept woman with no obvious purpose since, with the fall of the Soviet Union, America had no military enemies.

Consequences sometimes arrive tardily. After WWII, Zionists had conquered Palestine and begun mistreating its people in the manner of white South Africans at their worst. Moslems, of whom it later turned out there were quite a few, came to hate Zionists and, by extension, all Jews. Since America supplied the bombs that Israel used to kill Moslems, they came to hate the US as well. Thus 9/11. This was used as a pretext for war by hawkish wimps, now called Neocons. The conflicts were embraced by the Pentagon, which needed a raison d'être in the face of the lack of enemies. The ensuing wars were enthusiastically supported by evangelicals, more Zionists, confused patriots, imperialists, military industry, and those who just wanted to kill some Arabs, any Arabs. President W. Bush, with his eternal martial priapism and yokel grasp, was just the man. The military budget now was about a trillion a year in a country that owed more money than it could ever repay.

Many things had changed since the arrival of Columbus and smallpox. Americans still imagined themselves as Marlboro Man, rugged individualists, though many had never actually seen a live horse. In fact, the country had become a society of mass conformist consumerism with its tastes designed at corporate. America was still a land of opportunity if you were an Ivy techy with an IQ in excess of 180, but everybody else was pretty much screwed. Most people lived in velvet serfdom, afraid of the boss and imprisoned by the retirement system. Few young males could any longer meet the physical requirements for induction. The Army softened training so they could appear to get through. So much for Davy Crockett.

Americans had become the Frightened People, afraid of terror, of Moslems, of an outside world they couldn’t find or, in many cases, spell. The government used this bounty from heaven to justify rapid elimination of civil liberties, telling the public that it was to protect them. They still prided themselves on their democracy, without any longer having one, and on being a light to the world, which hated them. “The whole world hates us. What is wrong with the whole world?” they asked, deeply puzzled.

The looters came. In the past, there had been an element of noblesse oblige, of concern for the nation, a sense among the upper classes that they ought to pay some slight attention to keeping the country alive while picking its bones. This changed. The country was now ruled by the tightly interlocking directorates of Wall Street, Congress, the upper reaches of the executive branch, and the big corporations, none of whose members had ever worked a night shift at Wal-Mart while living in a rented trailer. The worst and brightest went to Harvard and then into i-banking. Thus the sub-prime adventure. This catastrophe was regarded as a cyclical correction instead of as the first notes of the knell.

By this time the country was acquiring the attributes of the Third World. Impunity: financiers did not go to jail for financial crimes, nor generals for war crimes, nor congressmen for anything. National incapacity: the government handled natural disasters with the adroitness one might expect of Burundi. Intractable slums festered in the cores of its great cities. Over its history, America had achieved greatly, done much that was admirable and much that wasn't, and now, overreaching, still convinced of its miraculous virtues, was perilously close to falling on its face.

From LRC, here.

The Dirty History of the Minimum Wage Law: EUGENICS

Listen or read the transcript from The Corbett Report here.

An excerpt:

The early progressive economists made no attempt to hide their eugenic motivations in promoting minimum wage laws.

Take Henry Rogers Seager, a Columbia economist and president of the American Association for Labor Legislation, who wrote in a key paper on the minimum-wage law published in The Annals of the American Academy in 1913:

“If we are to maintain a race that is to be made up of capable, efficient and independent individuals and family groups we must courageously cut off lines of heredity that have been proved to be undesirable by isolation or sterilization of the congenitally defective. Michigan has just passed an act requiring the sterilization of congenital idiots. This may seem somewhat remote from the minimum wage but such a policy judiciously extended should make easier the task of each on-coming generation which insists that every individual who is regularly employed in the competitive labor market shall receive at least a living wage for his work.”

In 1910, Royal Meeker, a Princeton economist who served as Woodrow Wilson’s U.S. Commissioner of Labor, opined that:

“It is much better to enact a minimum-wage law, even if it deprives these unfortunates of work. Better that the state should support the inefficient wholly and prevent the multiplication of the breed than subsidize incompetence and unthrift, enabling them to bring forth more of their kind.”

Arthur Holcombe, a Professor of Government at Harvard and a member of the Massachusetts Minimum Wage Commission, wrote approvingly of how Australia’s own minimum wage laws:

“. . . protect the white Australian’s standard of living from the invidious competition of the colored races, particularly of the Chinese.”

This is the real history of the minimum wage in America. By their own admission, a belief in the eugenic effect of eradicating the lower classes from the gene pool is the reason that its early progressive proponents advocated for minimum wage laws at all.

Of course, no one is suggesting that the people marching under the Fight For $15 banner are eugenicists, or that they are trying to exterminate the “defective germplasm” of the “unemployables.” This is patently not the case.

Modern-day progressives instead turn to newer economic models and theories to defend their “living wage” movement. A highly cited 1994 study by Princeton economists David Card and Alan Krueger, for instance, purported to find that increases in the minimum wage actually had, if anything, a positive effect on employment. If Card and Krueger's findings are true, then, modern progressives might argue, it doesn't matter why economists originally supported wage floors; the point is that they offer the working poor a hand up.

As the Minimum Wage Study at the University of Washington and similar research being conducted across academia are increasingly discovering, however, Card and Krueger’s paper (called an “intellectual revolution” by Paul Krugman) is incorrect or at the very least leaves out important details about the minimum wage’s true impact.

Read the rest here.