Wednesday, February 20, 2013

Demand elasticity: Food and addiction



The Extraordinary Science of Addictive Junk Food





On the evening of April 8, 1999, a long line of Town Cars and taxis pulled up to the Minneapolis headquarters of Pillsbury and discharged 11 men who controlled America’s largest food companies. Nestlé was in attendance, as were Kraft and Nabisco, General Mills and Procter & Gamble, Coca-Cola and Mars. Rivals any other day, the C.E.O.’s and company presidents had come together for a rare, private meeting. On the agenda was one item: the emerging obesity epidemic and how to deal with it. While the atmosphere was cordial, the men assembled were hardly friends. Their stature was defined by their skill in fighting one another for what they called “stomach share” — the amount of digestive space that any one company’s brand can grab from the competition.

James Behnke, a 55-year-old executive at Pillsbury, greeted the men as they arrived. He was anxious but also hopeful about the plan that he and a few other food-company executives had devised to engage the C.E.O.’s on America’s growing weight problem. “We were very concerned, and rightfully so, that obesity was becoming a major issue,” Behnke recalled. “People were starting to talk about sugar taxes, and there was a lot of pressure on food companies.” Getting the company chiefs in the same room to talk about anything, much less a sensitive issue like this, was a tricky business, so Behnke and his fellow organizers had scripted the meeting carefully, honing the message to its barest essentials. “C.E.O.’s in the food industry are typically not technical guys, and they’re uncomfortable going to meetings where technical people talk in technical terms about technical things,” Behnke said. “They don’t want to be embarrassed. They don’t want to make commitments. They want to maintain their aloofness and autonomy.”
A chemist by training with a doctoral degree in food science, Behnke became Pillsbury’s chief technical officer in 1979 and was instrumental in creating a long line of hit products, including microwaveable popcorn. He deeply admired Pillsbury but in recent years had grown troubled by pictures of obese children suffering from diabetes and the earliest signs of hypertension and heart disease. In the months leading up to the C.E.O. meeting, he was engaged in conversation with a group of food-science experts who were painting an increasingly grim picture of the public’s ability to cope with the industry’s formulations — from the body’s fragile controls on overeating to the hidden power of some processed foods to make people feel hungrier still. It was time, he and a handful of others felt, to warn the C.E.O.’s that their companies may have gone too far in creating and marketing products that posed the greatest health concerns.


The discussion took place in Pillsbury’s auditorium. The first speaker was a vice president of Kraft named Michael Mudd. “I very much appreciate this opportunity to talk to you about childhood obesity and the growing challenge it presents for us all,” Mudd began. “Let me say right at the start, this is not an easy subject. There are no easy answers — for what the public health community must do to bring this problem under control or for what the industry should do as others seek to hold it accountable for what has happened. But this much is clear: For those of us who’ve looked hard at this issue, whether they’re public health professionals or staff specialists in your own companies, we feel sure that the one thing we shouldn’t do is nothing.”
As he spoke, Mudd clicked through a deck of slides — 114 in all — projected on a large screen behind him. The figures were staggering. More than half of American adults were now considered overweight, with nearly one-quarter of the adult population — 40 million people — clinically defined as obese. Among children, the rates had more than doubled since 1980, and the number of kids considered obese had shot past 12 million. (This was still only 1999; the nation’s obesity rates would climb much higher.) Food manufacturers were now being blamed for the problem from all sides — academia, the Centers for Disease Control and Prevention, the American Heart Association and the American Cancer Society. The secretary of agriculture, over whom the industry had long held sway, had recently called obesity a “national epidemic.”
Mudd then did the unthinkable. He drew a connection to the last thing in the world the C.E.O.’s wanted linked to their products: cigarettes. First came a quote from a Yale University professor of psychology and public health, Kelly Brownell, who was an especially vocal proponent of the view that the processed-food industry should be seen as a public health menace: “As a culture, we’ve become upset by the tobacco companies advertising to children, but we sit idly by while the food companies do the very same thing. And we could make a claim that the toll taken on the public health by a poor diet rivals that taken by tobacco.”
“If anyone in the food industry ever doubted there was a slippery slope out there,” Mudd said, “I imagine they are beginning to experience a distinct sliding sensation right about now.”
Mudd then presented the plan he and others had devised to address the obesity problem. Merely getting the executives to acknowledge some culpability was an important first step, he knew, so his plan would start off with a small but crucial move: the industry should use the expertise of scientists — its own and others — to gain a deeper understanding of what was driving Americans to overeat. Once this was achieved, the effort could unfold on several fronts. To be sure, there would be no getting around the role that packaged foods and drinks play in overconsumption. They would have to pull back on their use of salt, sugar and fat, perhaps by imposing industrywide limits. But it wasn’t just a matter of these three ingredients; the schemes they used to advertise and market their products were critical, too. Mudd proposed creating a “code to guide the nutritional aspects of food marketing, especially to children.”
“We are saying that the industry should make a sincere effort to be part of the solution,” Mudd concluded. “And that by doing so, we can help to defuse the criticism that’s building against us.”
What happened next was not written down. But according to three participants, when Mudd stopped talking, the one C.E.O. whose recent exploits in the grocery store had awed the rest of the industry stood up to speak. His name was Stephen Sanger, and he was also the person — as head of General Mills — who had the most to lose when it came to dealing with obesity. Under his leadership, General Mills had overtaken not just the cereal aisle but other sections of the grocery store. The company’s Yoplait brand had transformed traditional unsweetened breakfast yogurt into a veritable dessert. It now had twice as much sugar per serving as General Mills’ marshmallow cereal Lucky Charms. And yet, because of yogurt’s well-tended image as a wholesome snack, sales of Yoplait were soaring, with annual revenue topping $500 million. Emboldened by the success, the company’s development wing pushed even harder, inventing a Yoplait variation that came in a squeezable tube — perfect for kids. They called it Go-Gurt and rolled it out nationally in the weeks before the C.E.O. meeting. (By year’s end, it would hit $100 million in sales.)
According to the sources I spoke with, Sanger began by reminding the group that consumers were “fickle.” (Sanger declined to be interviewed.) Sometimes they worried about sugar, other times fat. General Mills, he said, acted responsibly to both the public and shareholders by offering products to satisfy dieters and other concerned shoppers, from low sugar to added whole grains. But most often, he said, people bought what they liked, and they liked what tasted good. “Don’t talk to me about nutrition,” he reportedly said, taking on the voice of the typical consumer. “Talk to me about taste, and if this stuff tastes better, don’t run around trying to sell stuff that doesn’t taste good.”

To react to the critics, Sanger said, would jeopardize the sanctity of the recipes that had made his products so successful. General Mills would not pull back. He would push his people onward, and he urged his peers to do the same. Sanger’s response effectively ended the meeting.
“What can I say?” James Behnke told me years later. “It didn’t work. These guys weren’t as receptive as we thought they would be.” Behnke chose his words deliberately. He wanted to be fair. “Sanger was trying to say, ‘Look, we’re not going to screw around with the company jewels here and change the formulations because a bunch of guys in white coats are worried about obesity.’ ”
The meeting was remarkable, first, for the insider admissions of guilt. But I was also struck by how prescient the organizers of the sit-down had been. Today, one in three adults is considered clinically obese, along with one in five kids, and 24 million Americans are afflicted by type 2 diabetes, often caused by poor diet, with another 79 million people having pre-diabetes. Even gout, a painful form of arthritis once known as “the rich man’s disease” for its associations with gluttony, now afflicts eight million Americans.
The public and the food companies have known for decades now — or at the very least since this meeting — that sugary, salty, fatty foods are not good for us in the quantities that we consume them. So why are the diabetes and obesity and hypertension numbers still spiraling out of control? It’s not just a matter of poor willpower on the part of the consumer and a give-the-people-what-they-want attitude on the part of the food manufacturers. What I found, over four years of research and reporting, was a conscious effort — taking place in labs and marketing meetings and grocery-store aisles — to get people hooked on foods that are convenient and inexpensive. I talked to more than 300 people in or formerly employed by the processed-food industry, from scientists to marketers to C.E.O.’s. Some were willing whistle-blowers, while others spoke reluctantly when presented with some of the thousands of pages of secret memos that I obtained from inside the food industry’s operations. What follows is a series of small case studies of a handful of characters whose work then, and perspective now, sheds light on how the foods are created and sold to people who, while not powerless, are extremely vulnerable to the intensity of these companies’ industrial formulations and selling campaigns.

I. ‘In This Field, I’m a Game Changer.’
John Lennon couldn’t find it in England, so he had cases of it shipped from New York to fuel the “Imagine” sessions. The Beach Boys, ZZ Top and Cher all stipulated in their contract riders that it be put in their dressing rooms when they toured. Hillary Clinton asked for it when she traveled as first lady, and ever after her hotel suites were dutifully stocked.
What they all wanted was Dr Pepper, which until 2001 occupied a comfortable third-place spot in the soda aisle behind Coca-Cola and Pepsi. But then a flood of spinoffs from the two soda giants showed up on the shelves — lemons and limes, vanillas and coffees, raspberries and oranges, whites and blues and clears — what in food-industry lingo are known as “line extensions,” and Dr Pepper started to lose its market share.
Responding to this pressure, Cadbury Schweppes created its first spinoff, other than a diet version, in the soda’s 115-year history, a bright red soda with a very un-Dr Pepper name: Red Fusion. “If we are to re-establish Dr Pepper back to its historic growth rates, we have to add more excitement,” the company’s president, Jack Kilduff, said. One particularly promising market, Kilduff pointed out, was the “rapidly growing Hispanic and African-American communities.”

But consumers hated Red Fusion. “Dr Pepper is my all-time favorite drink, so I was curious about the Red Fusion,” a California mother of three wrote on a blog to warn other Peppers away. “It’s disgusting. Gagging. Never again.”
Stung by the rejection, Cadbury Schweppes in 2004 turned to a food-industry legend named Howard Moskowitz. Moskowitz, who studied mathematics and holds a Ph.D. in experimental psychology from Harvard, runs a consulting firm in White Plains, where for more than three decades he has “optimized” a variety of products for Campbell Soup, General Foods, Kraft and PepsiCo. “I’ve optimized soups,” Moskowitz told me. “I’ve optimized pizzas. I’ve optimized salad dressings and pickles. In this field, I’m a game changer.”
In the process of product optimization, food engineers alter a litany of variables with the sole intent of finding the most perfect version (or versions) of a product. Ordinary consumers are paid to spend hours sitting in rooms where they touch, feel, sip, smell, swirl and taste whatever product is in question. Their opinions are dumped into a computer, and the data are sifted and sorted through a statistical method called conjoint analysis, which determines what features will be most attractive to consumers. Moskowitz likes to imagine that his computer is divided into silos, in which each of the attributes is stacked. But it’s not simply a matter of comparing Color 23 with Color 24. In the most complicated projects, Color 23 must be combined with Syrup 11 and Packaging 6, and on and on, in seemingly infinite combinations. Even for jobs in which the only concern is taste and the variables are limited to the ingredients, endless charts and graphs will come spewing out of Moskowitz’s computer. “The mathematical model maps out the ingredients to the sensory perceptions these ingredients create,” he told me, “so I can just dial a new product. This is the engineering approach.”
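To picture what that computer is doing, here is a minimal sketch of the kind of main-effects regression that sits behind conjoint analysis. Everything in it is invented for illustration: the attributes, the coding and the panel scores are assumptions, not figures from any of Moskowitz’s projects.

```python
# Toy conjoint-style optimization: invented attributes and invented panel
# scores, fitted with ordinary least squares to see which attribute levels
# drive overall liking and which combination the model scores highest.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coded attribute levels (names and values are assumptions).
sweetness = [0.0, 0.5, 1.0]   # amount of syrup
color     = [0.0, 1.0]        # light vs. dark
packaging = [0.0, 1.0]        # bottle vs. can

combos = np.array(list(itertools.product(sweetness, color, packaging)))

# Pretend a taste panel rated every combination on a 0-100 liking scale,
# with an unknown preference structure plus noise.
true_weights = np.array([18.0, -6.0, 3.0])
liking = 55 + combos @ true_weights + rng.normal(0, 2, size=len(combos))

# Conjoint-style step: regress liking on the coded attributes to estimate
# how much each one contributes to overall appeal.
X = np.column_stack([np.ones(len(combos)), combos])
coef, *_ = np.linalg.lstsq(X, liking, rcond=None)
print("estimated effects (sweetness, color, packaging):", coef[1:].round(2))

# Then "dial a new product": pick the combination the fitted model likes best.
best = combos[np.argmax(X @ coef)]
print("model's preferred combination:", best)
```

Real studies add interaction terms and many more “silos,” but the mechanism is the same: scores in, fitted weights out, then a search over combinations.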
Moskowitz’s work on Prego spaghetti sauce was memorialized in a 2004 presentation by the author Malcolm Gladwell at the TED conference in Monterey, Calif.: “After . . . months and months, he had a mountain of data about how the American people feel about spaghetti sauce. . . . And sure enough, if you sit down and you analyze all this data on spaghetti sauce, you realize that all Americans fall into one of three groups. There are people who like their spaghetti sauce plain. There are people who like their spaghetti sauce spicy. And there are people who like it extra-chunky. And of those three facts, the third one was the most significant, because at the time, in the early 1980s, if you went to a supermarket, you would not find extra-chunky spaghetti sauce. And Prego turned to Howard, and they said, ‘Are you telling me that one-third of Americans crave extra-chunky spaghetti sauce, and yet no one is servicing their needs?’ And he said, ‘Yes.’ And Prego then went back and completely reformulated their spaghetti sauce and came out with a line of extra-chunky that immediately and completely took over the spaghetti-sauce business in this country. . . . That is Howard’s gift to the American people. . . . He fundamentally changed the way the food industry thinks about making you happy.”
Well, yes and no. One thing Gladwell didn’t mention is that the food industry already knew some things about making people happy — and it started with sugar. Many of the Prego sauces — whether cheesy, chunky or light — have one feature in common: The largest ingredient, after tomatoes, is sugar. A mere half-cup of Prego Traditional, for instance, has the equivalent of more than two teaspoons of sugar, as much as two-plus Oreo cookies. It also delivers one-third of the sodium recommended for a majority of American adults for an entire day. In making these sauces, Campbell supplied the ingredients, including the salt, sugar and, for some versions, fat, while Moskowitz supplied the optimization. “More is not necessarily better,” Moskowitz wrote in his own account of the Prego project. “As the sensory intensity (say, of sweetness) increases, consumers first say that they like the product more, but eventually, with a middle level of sweetness, consumers like the product the most (this is their optimum, or ‘bliss,’ point).”


I first met Moskowitz on a crisp day in the spring of 2010 at the Harvard Club in Midtown Manhattan. As we talked, he made clear that while he has worked on numerous projects aimed at creating more healthful foods and insists the industry could be doing far more to curb obesity, he had no qualms about his own pioneering work on discovering what industry insiders now regularly refer to as “the bliss point” or any of the other systems that helped food companies create the greatest amount of crave. “There’s no moral issue for me,” he said. “I did the best science I could. I was struggling to survive and didn’t have the luxury of being a moral creature. As a researcher, I was ahead of my time.”


Moskowitz’s path to mastering the bliss point began in earnest not at Harvard but a few months after graduation, 16 miles from Cambridge, in the town of Natick, where the U.S. Army hired him to work in its research labs. The military has long been in a peculiar bind when it comes to food: how to get soldiers to eat more rations when they are in the field. It knew that over time, soldiers would gradually find their meals-ready-to-eat so boring that they would toss them away, half-eaten, and not get all the calories they needed. But what was causing this M.R.E.-fatigue was a mystery. “So I started asking soldiers how frequently they would like to eat this or that, trying to figure out which products they would find boring,” Moskowitz said. The answers he got were inconsistent. “They liked flavorful foods like turkey tetrazzini, but only at first; they quickly grew tired of them. On the other hand, mundane foods like white bread would never get them too excited, but they could eat lots and lots of it without feeling they’d had enough.”
This contradiction is known as “sensory-specific satiety.” In lay terms, it is the tendency for big, distinct flavors to overwhelm the brain, which responds by depressing your desire to have more. Sensory-specific satiety also became a guiding principle for the processed-food industry. The biggest hits — be they Coca-Cola or Doritos — owe their success to complex formulas that pique the taste buds enough to be alluring but don’t have a distinct, overriding single flavor that tells the brain to stop eating.
Thirty-two years after he began experimenting with the bliss point, Moskowitz got the call from Cadbury Schweppes asking him to create a good line extension for Dr Pepper. I spent an afternoon in his White Plains offices as he and his vice president for research, Michele Reisner, walked me through the Dr Pepper campaign. Cadbury wanted its new flavor to have cherry and vanilla on top of the basic Dr Pepper taste. Thus, there were three main components to play with: a sweet cherry flavoring, a sweet vanilla flavoring and a sweet syrup known as “Dr Pepper flavoring.”
Finding the bliss point required the preparation of 61 subtly distinct formulas — 31 for the regular version and 30 for diet. The formulas were then subjected to 3,904 tastings organized in Los Angeles, Dallas, Chicago and Philadelphia. The Dr Pepper tasters began working through their samples, resting five minutes between each sip to restore their taste buds. After each sample, they gave numerically ranked answers to a set of questions: How much did they like it overall? How strong is the taste? How do they feel about the taste? How would they describe the quality of this product? How likely would they be to purchase this product?
Moskowitz’s data — compiled in a 135-page report for the soda maker — is tremendously fine-grained, showing how different people and groups of people feel about a strong vanilla taste versus weak, various aspects of aroma and the powerful sensory force that food scientists call “mouth feel.” This is the way a product interacts with the mouth, as defined more specifically by a host of related sensations, from dryness to gumminess to moisture release. These are terms more familiar to sommeliers, but the mouth feel of soda and many other food items, especially those high in fat, is second only to the bliss point in its ability to predict how much craving a product will induce.
In addition to taste, the consumers were also tested on their response to color, which proved to be highly sensitive. “When we increased the level of the Dr Pepper flavoring, it gets darker and liking goes off,” Reisner said. These preferences can also be cross-referenced by age, sex and race.
 
On Page 83 of the report, a thin blue line represents the amount of Dr Pepper flavoring needed to generate maximum appeal. The line is shaped like an upside-down U, just like the bliss-point curve that Moskowitz studied 30 years earlier in his Army lab. And at the top of the arc, there is not a single sweet spot but instead a sweet range, within which “bliss” was achievable. This meant that Cadbury could edge back on its key ingredient, the sugary Dr Pepper syrup, without falling out of the range and losing the bliss. Instead of using 2 milliliters of the flavoring, for instance, they could use 1.69 milliliters and achieve the same effect. The potential savings is merely a few percentage points, and it won’t mean much to individual consumers who are counting calories or grams of sugar. But for Dr Pepper, it adds up to colossal savings. “That looks like nothing,” Reisner said. “But it’s a lot of money. A lot of money. Millions.”
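The arithmetic behind that sweet range is easy to sketch. The inverted-U below is an illustrative quadratic, not the curve on Page 83 of the report; its coefficients are chosen only so that the band within one liking point of the peak roughly spans the 1.69-to-2-milliliter interval described above.

```python
# Illustrative bliss-range calculation: an assumed inverted-U liking curve,
# not the actual curve from the Dr Pepper report.
import numpy as np

def liking(ml):
    """Hypothetical panel liking as a function of ml of Dr Pepper flavoring."""
    return 70 - 40 * (ml - 1.85) ** 2   # assumed peak near 1.85 ml

ml = np.linspace(1.0, 2.5, 1501)
scores = liking(ml)

# "Bliss range": every dosage whose predicted liking sits within one point
# of the maximum.
in_range = ml[scores >= scores.max() - 1.0]
print(f"bliss range: {in_range.min():.2f} ml to {in_range.max():.2f} ml")

# The cost logic: 1.69 ml instead of 2 ml of the syrup per serving is roughly
# a 15% cut in that one ingredient; trivial per can, enormous across
# millions of cases.
print(f"flavoring cut per serving: {(2.0 - 1.69) / 2.0:.1%}")
```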
The soda that emerged from all of Moskowitz’s variations became known as Cherry Vanilla Dr Pepper, and it proved successful beyond anything Cadbury imagined. In 2008, Cadbury split off its soft-drinks business, which included Snapple and 7-Up. The Dr Pepper Snapple Group has since been valued in excess of $11 billion.
II. ‘Lunchtime Is All Yours’
Sometimes innovations within the food industry happen in the lab, with scientists dialing in specific ingredients to achieve the greatest allure. And sometimes, as in the case of Oscar Mayer’s bologna crisis, the innovation involves putting old products in new packages.
The 1980s were tough times for Oscar Mayer. Red-meat consumption fell more than 10 percent as fat became synonymous with cholesterol, clogged arteries, heart attacks and strokes. Anxiety set in at the company’s headquarters in Madison, Wis., where executives worried about their future and the pressure they faced from their new bosses at Philip Morris.
Bob Drane was the company’s vice president for new business strategy and development when Oscar Mayer tapped him to try to find some way to reposition bologna and other troubled meats that were declining in popularity and sales. I met Drane at his home in Madison and went through the records he had kept on the birth of what would become much more than his solution to the company’s meat problem. In 1985, when Drane began working on the project, his orders were to “figure out how to contemporize what we’ve got.”
Drane’s first move was to try to zero in not on what Americans felt about processed meat but on what Americans felt about lunch. He organized focus-group sessions with the people most responsible for buying bologna — mothers — and as they talked, he realized the most pressing issue for them was time. Working moms strove to provide healthful food, of course, but they spoke with real passion and at length about the morning crush, that nightmarish dash to get breakfast on the table and lunch packed and kids out the door. He summed up their remarks for me like this: “It’s awful. I am scrambling around. My kids are asking me for stuff. I’m trying to get myself ready to go to the office. I go to pack these lunches, and I don’t know what I’ve got.” What the moms revealed to him, Drane said, was “a gold mine of disappointments and problems.”
He assembled a team of about 15 people with varied skills, from design to food science to advertising, to create something completely new — a convenient prepackaged lunch that would have as its main building block the company’s sliced bologna and ham. They wanted to add bread, naturally, because who ate bologna without it? But this presented a problem: There was no way bread could stay fresh for the two months their product needed to sit in warehouses or in grocery coolers. Crackers, however, could — so they added a handful of cracker rounds to the package. Using cheese was the next obvious move, given its increased presence in processed foods. But what kind of cheese would work? Natural Cheddar, which they started off with, crumbled and didn’t slice very well, so they moved on to processed varieties, which could bend and be sliced and would last forever, or they could knock another two cents off per unit by using an even lesser product called “cheese food,” which had lower scores than processed cheese in taste tests. The cost dilemma was solved when Oscar Mayer merged with Kraft in 1989 and the company didn’t have to shop for cheese anymore; it got all the processed cheese it wanted from its new sister company, and at cost.


Drane’s team moved into a nearby hotel, where they set out to find the right mix of components and container. They gathered around tables where bagfuls of meat, cheese, crackers and all sorts of wrapping material had been dumped, and they let their imaginations run. After snipping and taping their way through a host of failures, the model they fell back on was the American TV dinner — and after some brainstorming about names (Lunch Kits? Go-Packs? Fun Mealz?), Lunchables were born.


The trays flew off the grocery-store shelves. Sales hit a phenomenal $218 million in the first 12 months, more than anyone was prepared for. This only brought Drane his next crisis. The production costs were so high that they were losing money with each tray they produced. So Drane flew to New York, where he met with Philip Morris officials who promised to give him the money he needed to keep it going. “The hard thing is to figure out something that will sell,” he was told. “You’ll figure out how to get the cost right.” Projected to lose $6 million in 1991, the trays instead broke even; the next year, they earned $8 million.
With production costs trimmed and profits coming in, the next question was how to expand the franchise, which they did by turning to one of the cardinal rules in processed food: When in doubt, add sugar. “Lunchables With Dessert is a logical extension,” an Oscar Mayer official reported to Philip Morris executives in early 1991. The “target” remained the same as it was for regular Lunchables — “busy mothers” and “working women,” ages 25 to 49 — and the “enhanced taste” would attract shoppers who had grown bored with the current trays. A year later, the dessert Lunchable morphed into the Fun Pack, which would come with a Snickers bar, a package of M&M’s or a Reese’s Peanut Butter Cup, as well as a sugary drink. The Lunchables team started by using Kool-Aid and cola and then Capri Sun after Philip Morris added that drink to its stable of brands.
Eventually, a line of the trays, appropriately called Maxed Out, was released that had as many as nine grams of saturated fat, or nearly an entire day’s recommended maximum for kids, with up to two-thirds of the max for sodium and 13 teaspoons of sugar.
When I asked Geoffrey Bible, former C.E.O. of Philip Morris, about this shift toward more salt, sugar and fat in meals for kids, he smiled and noted that even in its earliest incarnation, Lunchables was held up for criticism. “One article said something like, ‘If you take Lunchables apart, the most healthy item in it is the napkin.’ ”
Well, they did have a good bit of fat, I offered. “You bet,” he said. “Plus cookies.”
The prevailing attitude among the company’s food managers — through the 1990s, at least, before obesity became a more pressing concern — was one of supply and demand. “People could point to these things and say, ‘They’ve got too much sugar, they’ve got too much salt,’ ” Bible said. “Well, that’s what the consumer wants, and we’re not putting a gun to their head to eat it. That’s what they want. If we give them less, they’ll buy less, and the competitor will get our market. So you’re sort of trapped.” (Bible would later press Kraft to reconsider its reliance on salt, sugar and fat.)
When it came to Lunchables, they did try to add more healthful ingredients. Back at the start, Drane experimented with fresh carrots but quickly gave up on that, since fresh components didn’t work within the constraints of the processed-food system, which typically required weeks or months of transport and storage before the food arrived at the grocery store. Later, a low-fat version of the trays was developed, using meats and cheese and crackers that were formulated with less fat, but it tasted inferior, sold poorly and was quickly scrapped.
When I met with Kraft officials in 2011 to discuss their products and policies on nutrition, they had dropped the Maxed Out line and were trying to improve the nutritional profile of Lunchables through smaller, incremental changes that were less noticeable to consumers. Across the Lunchables line, they said they had reduced the salt, sugar and fat by about 10 percent, and new versions, featuring mandarin-orange and pineapple slices, were in development. These would be promoted as more healthful versions, with “fresh fruit,” but their list of ingredients — containing upward of 70 items, with sucrose, corn syrup, high-fructose corn syrup and fruit concentrate all in the same tray — have been met with intense criticism from outside the industry.
One of the company’s responses to criticism is that kids don’t eat the Lunchables every day — on top of which, when it came to trying to feed them more healthful foods, kids themselves were unreliable. When their parents packed fresh carrots, apples and water, they couldn’t be trusted to eat them. Once in school, they often trashed the healthful stuff in their brown bags to get right to the sweets.
This idea — that kids are in control — would become a key concept in the evolving marketing campaigns for the trays. In what would prove to be their greatest achievement of all, the Lunchables team would delve into adolescent psychology to discover that it wasn’t the food in the trays that excited the kids; it was the feeling of power it brought to their lives. As Bob Eckert, then the C.E.O. of Kraft, put it in 1999: “Lunchables aren’t about lunch. It’s about kids being able to put together what they want to eat, anytime, anywhere.”
Kraft’s early Lunchables campaign targeted mothers. They might be too distracted by work to make a lunch, but they loved their kids enough to offer them this prepackaged gift. But as the focus swung toward kids, Saturday-morning cartoons started carrying an ad that offered a different message: “All day, you gotta do what they say,” the ads said. “But lunchtime is all yours.”
With this marketing strategy in place and pizza Lunchables — the crust in one compartment, the cheese, pepperoni and sauce in others — proving to be a runaway success, the entire world of fast food suddenly opened up for Kraft to pursue. They came out with a Mexican-themed Lunchables called Beef Taco Wraps; a Mini Burgers Lunchables; a Mini Hot Dog Lunchable, which also happened to provide a way for Oscar Mayer to sell its wieners. By 1999, pancakes — which included syrup, icing, Lifesavers candy and Tang, for a whopping 76 grams of sugar — and waffles were, for a time, part of the Lunchables franchise as well.
Annual sales kept climbing, past $500 million, past $800 million; at last count, including sales in Britain, they were approaching the $1 billion mark. Lunchables was more than a hit; it was now its own category. Eventually, more than 60 varieties of Lunchables and other brands of trays would show up in the grocery stores. In 2007, Kraft even tried a Lunchables Jr. for 3- to 5-year-olds.
In the trove of records that document the rise of the Lunchables and the sweeping change it brought to lunchtime habits, I came across a photograph of Bob Drane’s daughter, which he had slipped into the Lunchables presentation he showed to food developers. The picture was taken on Monica Drane’s wedding day in 1989, and she was standing outside the family’s home in Madison, a beautiful bride in a white wedding dress, holding one of the brand-new yellow trays.
During the course of reporting, I finally had a chance to ask her about it. Was she really that much of a fan? “There must have been some in the fridge,” she told me. “I probably just took one out before we went to the church. My mom had joked that it was really like their fourth child, my dad invested so much time and energy on it.”

Monica Drane had three of her own children by the time we spoke, ages 10, 14 and 17. “I don’t think my kids have ever eaten a Lunchable,” she told me. “They know they exist and that Grandpa Bob invented them. But we eat very healthfully.”
Drane himself paused only briefly when I asked him if, looking back, he was proud of creating the trays. “Lots of things are trade-offs,” he said. “And I do believe it’s easy to rationalize anything. In the end, I wish that the nutritional profile of the thing could have been better, but I don’t view the entire project as anything but a positive contribution to people’s lives.”
Today Bob Drane is still talking to kids about what they like to eat, but his approach has changed. He volunteers with a nonprofit organization that seeks to build better communications between school kids and their parents, and right in the mix of their problems, alongside the academic struggles, is childhood obesity. Drane has also prepared a précis on the food industry that he used with medical students at the University of Wisconsin. And while he does not name his Lunchables in this document, and cites numerous causes for the obesity epidemic, he holds the entire industry accountable. “What do University of Wisconsin M.B.A.’s learn about how to succeed in marketing?” his presentation to the med students asks. “Discover what consumers want to buy and give it to them with both barrels. Sell more, keep your job! How do marketers often translate these ‘rules’ into action on food? Our limbic brains love sugar, fat, salt. . . . So formulate products to deliver these. Perhaps add low-cost ingredients to boost profit margins. Then ‘supersize’ to sell more. . . . And advertise/promote to lock in ‘heavy users.’ Plenty of guilt to go around here!”

III. ‘It’s Called Vanishing Caloric Density.’
At a symposium for nutrition scientists in Los Angeles on Feb. 15, 1985, a professor of pharmacology from Helsinki named Heikki Karppanen told the remarkable story of Finland’s effort to address its salt habit. In the late 1970s, the Finns were consuming huge amounts of sodium, eating on average more than two teaspoons of salt a day. As a result, the country had developed significant issues with high blood pressure, and men in the eastern part of Finland had the highest rate of fatal cardiovascular disease in the world. Research showed that this plague was not just a quirk of genetics or a result of a sedentary lifestyle — it was also owing to processed foods. So when Finnish authorities moved to address the problem, they went right after the manufacturers. (The Finnish response worked. Every grocery item that was heavy in salt would come to be marked prominently with the warning “High Salt Content.” By 2007, Finland’s per capita consumption of salt had dropped by a third, and this shift — along with improved medical care — was accompanied by a 75 percent to 80 percent decline in the number of deaths from strokes and heart disease.)
Karppanen’s presentation was met with applause, but one man in the crowd seemed particularly intrigued by the presentation, and as Karppanen left the stage, the man intercepted him and asked if they could talk more over dinner. Their conversation later that night was not at all what Karppanen was expecting. His host did indeed have an interest in salt, but from quite a different vantage point: the man’s name was Robert I-San Lin, and from 1974 to 1982, he worked as the chief scientist for Frito-Lay, the nearly $3-billion-a-year manufacturer of Lay’s, Doritos, Cheetos and Fritos.
Lin’s time at Frito-Lay coincided with the first attacks by nutrition advocates on salty foods and the first calls for federal regulators to reclassify salt as a “risky” food additive, which could have subjected it to severe controls. No company took this threat more seriously — or more personally — than Frito-Lay, Lin explained to Karppanen over their dinner. Three years after he left Frito-Lay, he was still anguished over his inability to effectively change the company’s recipes and practices.
By chance, I ran across a letter that Lin sent to Karppanen three weeks after that dinner, buried in some files to which I had gained access. Attached to the letter was a memo written when Lin was at Frito-Lay, which detailed some of the company’s efforts in defending salt. I tracked Lin down in Irvine, Calif., where we spent several days going through the internal company memos, strategy papers and handwritten notes he had kept. The documents were evidence of the concern that Lin had for consumers and of the company’s intent on using science not to address the health concerns but to thwart them. While at Frito-Lay, Lin and other company scientists spoke openly about the country’s excessive consumption of sodium and the fact that, as Lin said to me on more than one occasion, “people get addicted to salt.”
Not much had changed by 1986, except Frito-Lay found itself on a rare cold streak. The company had introduced a series of high-profile products that failed miserably. Toppels, a cracker with cheese topping; Stuffers, a shell with a variety of fillings; Rumbles, a bite-size granola snack — they all came and went in a blink, and the company took a $52 million hit. Around that time, the marketing team was joined by Dwight Riskey, an expert on cravings who had been a fellow at the Monell Chemical Senses Center in Philadelphia, where he was part of a team of scientists that found that people could beat their salt habits simply by refraining from salty foods long enough for their taste buds to return to a normal level of sensitivity. He had also done work on the bliss point, showing how a product’s allure is contextual, shaped partly by the other foods a person is eating, and that it changes as people age. This seemed to help explain why Frito-Lay was having so much trouble selling new snacks. The largest single block of customers, the baby boomers, had begun hitting middle age. According to the research, this suggested that their liking for salty snacks — both in the concentration of salt and how much they ate — would be tapering off. Along with the rest of the snack-food industry, Frito-Lay anticipated lower sales because of an aging population, and marketing plans were adjusted to focus even more intently on younger consumers.
Except that snack sales didn’t decline as everyone had projected, Frito-Lay’s doomed product launches notwithstanding. Poring over data one day in his home office, trying to understand just who was consuming all the snack food, Riskey realized that he and his colleagues had been misreading things all along. They had been measuring the snacking habits of different age groups and were seeing what they expected to see, that older consumers ate less than those in their 20s. But what they weren’t measuring, Riskey realized, is how those snacking habits of the boomers compared to themselves when they were in their 20s. When he called up a new set of sales data and performed what’s called a cohort study, following a single group over time, a far more encouraging picture — for Frito-Lay, anyway — emerged. The baby boomers were not eating fewer salty snacks as they aged. “In fact, as those people aged, their consumption of all those segments — the cookies, the crackers, the candy, the chips — was going up,” Riskey said. “They were not only eating what they ate when they were younger, they were eating more of it.” In fact, everyone in the country, on average, was eating more salty snacks than they used to. The rate of consumption was edging up about one-third of a pound every year, with the average intake of snacks like chips and cheese crackers pushing past 12 pounds a year.
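Riskey’s misreading and his correction come down to the difference between a cross-sectional comparison and a cohort analysis, which is easy to see on a toy table. The consumption figures below are invented; only the shape of the argument matters.

```python
# Cross-sectional vs. cohort reading of the same table of snack consumption.
# All numbers are invented for illustration, not real Frito-Lay data.
import numpy as np

years      = [1975, 1985, 1995]
age_groups = ["20s", "30s", "40s"]

# Rows: survey year; columns: age of the respondents in that year
# (pounds of salty snacks per person per year).
snacks = np.array([
    [ 9.0,  7.5,  6.0],   # 1975
    [10.5,  9.5,  8.0],   # 1985
    [12.0, 11.5, 10.5],   # 1995
])

# Cross-sectional view (what the marketers had been doing): within any one
# year, older groups eat less, so aging looks like decline.
print("1995 snapshot by age:", dict(zip(age_groups, snacks[2])))

# Cohort view (Riskey's correction): follow the people who were in their
# 20s in 1975 down the table's diagonal -- their intake actually rises.
cohort = [snacks[i, i] for i in range(len(years))]
print("one cohort at ages 20s, 30s, 40s:", cohort)
```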
Riskey had a theory about what caused this surge: Eating real meals had become a thing of the past. Baby boomers, especially, seemed to have greatly cut down on regular meals. They were skipping breakfast when they had early-morning meetings. They skipped lunch when they then needed to catch up on work because of those meetings. They skipped dinner when their kids stayed out late or grew up and moved out of the house. And when they skipped these meals, they replaced them with snacks. “We looked at this behavior, and said, ‘Oh, my gosh, people were skipping meals right and left,’ ” Riskey told me. “It was amazing.” This led to the next realization, that baby boomers did not represent “a category that is mature, with no growth. This is a category that has huge growth potential.”
The food technicians stopped worrying about inventing new products and instead embraced the industry’s most reliable method for getting consumers to buy more: the line extension. The classic Lay’s potato chips were joined by Salt & Vinegar, Salt & Pepper and Cheddar & Sour Cream. They put out Chili-Cheese-flavored Fritos, and Cheetos were transformed into 21 varieties. Frito-Lay had a formidable research complex near Dallas, where nearly 500 chemists, psychologists and technicians conducted research that cost up to $30 million a year, and the science corps focused intense amounts of resources on questions of crunch, mouth feel and aroma for each of these items. Their tools included a $40,000 device that simulated a chewing mouth to test and perfect the chips, discovering things like the perfect break point: people like a chip that snaps with about four pounds of pressure per square inch.
To get a better feel for their work, I called on Steven Witherly, a food scientist who wrote a fascinating guide for industry insiders titled, “Why Humans Like Junk Food.” I brought him two shopping bags filled with a variety of chips to taste. He zeroed right in on the Cheetos. “This,” Witherly said, “is one of the most marvelously constructed foods on the planet, in terms of pure pleasure.” He ticked off a dozen attributes of the Cheetos that make the brain say more. But the one he focused on most was the puff’s uncanny ability to melt in the mouth. “It’s called vanishing caloric density,” Witherly said. “If something melts down quickly, your brain thinks that there’s no calories in it . . . you can just keep eating it forever.”
As for their marketing troubles, in a March 2010 meeting, Frito-Lay executives hastened to tell their Wall Street investors that the 1.4 billion boomers worldwide weren’t being neglected; they were redoubling their efforts to understand exactly what it was that boomers most wanted in a snack chip. Which was basically everything: great taste, maximum bliss but minimal guilt about health and more maturity than puffs. “They snack a lot,” Frito-Lay’s chief marketing officer, Ann Mukherjee, told the investors. “But what they’re looking for is very different. They’re looking for new experiences, real food experiences.” Frito-Lay acquired Stacy’s Pita Chip Company, which was started by a Massachusetts couple who made food-cart sandwiches and started serving pita chips to their customers in the mid-1990s. In Frito-Lay’s hands, the pita chips averaged 270 milligrams of sodium — nearly one-fifth a whole day’s recommended maximum for most American adults — and were a huge hit among boomers.
The Frito-Lay executives also spoke of the company’s ongoing pursuit of a “designer sodium,” which they hoped, in the near future, would take their sodium loads down by 40 percent. No need to worry about lost sales there, the company’s C.E.O., Al Carey, assured their investors. The boomers would see less salt as the green light to snack like never before.
There’s a paradox at work here. On the one hand, reduction of sodium in snack foods is commendable. On the other, these changes may well result in consumers eating more. “The big thing that will happen here is removing the barriers for boomers and giving them permission to snack,” Carey said. The prospects for lower-salt snacks were so amazing, he added, that the company had set its sights on using the designer salt to conquer the toughest market of all for snacks: schools. He cited, for example, the school-food initiative championed by Bill Clinton and the American Heart Association, which is seeking to improve the nutrition of school food by limiting its load of salt, sugar and fat. “Imagine this,” Carey said. “A potato chip that tastes great and qualifies for the Clinton-A.H.A. alliance for schools . . . . We think we have ways to do all of this on a potato chip, and imagine getting that product into schools, where children can have this product and grow up with it and feel good about eating it.”
Carey’s quote reminded me of something I read in the early stages of my reporting, a 24-page report prepared for Frito-Lay in 1957 by a psychologist named Ernest Dichter. The company’s chips, he wrote, were not selling as well as they could for one simple reason: “While people like and enjoy potato chips, they feel guilty about liking them. . . . Unconsciously, people expect to be punished for ‘letting themselves go’ and enjoying them.” Dichter listed seven “fears and resistances” to the chips: “You can’t stop eating them; they’re fattening; they’re not good for you; they’re greasy and messy to eat; they’re too expensive; it’s hard to store the leftovers; and they’re bad for children.” He spent the rest of his memo laying out his prescriptions, which in time would become widely used not just by Frito-Lay but also by the entire industry. Dichter suggested that Frito-Lay avoid using the word “fried” in referring to its chips and adopt instead the more healthful-sounding term “toasted.” To counteract the “fear of letting oneself go,” he suggested repacking the chips into smaller bags. “The more-anxious consumers, the ones who have the deepest fears about their capacity to control their appetite, will tend to sense the function of the new pack and select it,” he said.
Dichter advised Frito-Lay to move its chips out of the realm of between-meals snacking and turn them into an ever-present item in the American diet. “The increased use of potato chips and other Lay’s products as a part of the regular fare served by restaurants and sandwich bars should be encouraged in a concentrated way,” Dichter said, citing a string of examples: “potato chips with soup, with fruit or vegetable juice appetizers; potato chips served as a vegetable on the main dish; potato chips with salad; potato chips with egg dishes for breakfast; potato chips with sandwich orders.”
In 2011, The New England Journal of Medicine published a study that shed new light on America’s weight gain. The subjects — 120,877 women and men — were all professionals in the health field, and were likely to be more conscious about nutrition, so the findings might well understate the overall trend. Using data back to 1986, the researchers monitored everything the participants ate, as well as their physical activity and smoking. They found that every four years, the participants exercised less, watched TV more and gained an average of 3.35 pounds. The researchers parsed the data by the caloric content of the foods being eaten, and found the top contributors to weight gain included red meat and processed meats, sugar-sweetened beverages and potatoes, including mashed and French fries. But the largest weight-inducing food was the potato chip. The coating of salt, the fat content that rewards the brain with instant feelings of pleasure, the sugar that exists not as an additive but in the starch of the potato itself — all of this combines to make it the perfect addictive food. “The starch is readily absorbed,” Eric Rimm, an associate professor of epidemiology and nutrition at the Harvard School of Public Health and one of the study’s authors, told me. “More quickly even than a similar amount of sugar. The starch, in turn, causes the glucose levels in the blood to spike” — which can result in a craving for more.
If Americans snacked only occasionally, and in small amounts, this would not present the enormous problem that it does. But because so much money and effort has been invested over decades in engineering and then relentlessly selling these products, the effects are seemingly impossible to unwind. More than 30 years have passed since Robert Lin first tangled with Frito-Lay on the imperative of the company to deal with the formulation of its snacks, but as we sat at his dining-room table, sifting through his records, the feelings of regret still played on his face. In his view, three decades had been lost, time that he and a lot of other smart scientists could have spent searching for ways to ease the addiction to salt, sugar and fat. “I couldn’t do much about it,” he told me. “I feel so sorry for the public.”





The growing attention Americans are paying to what they put into their mouths has touched off a new scramble by the processed-food companies to address health concerns. Pressed by the Obama administration and consumers, Kraft, Nestlé, Pepsi, Campbell and General Mills, among others, have begun to trim the loads of salt, sugar and fat in many products. And with consumer advocates pushing for more government intervention, Coca-Cola made headlines in January by releasing ads that promoted its bottled water and low-calorie drinks as a way to counter obesity. Predictably, the ads drew a new volley of scorn from critics who pointed to the company’s continuing drive to sell sugary Coke.


One of the other executives I spoke with at length was Jeffrey Dunn, who, in 2001, at age 44, was directing more than half of Coca-Cola’s $20 billion in annual sales as president and chief operating officer in both North and South America. In an effort to control as much market share as possible, Coke extended its aggressive marketing to especially poor or vulnerable areas of the U.S., like New Orleans — where people were drinking twice as much Coke as the national average — or Rome, Ga., where the per capita intake was nearly three Cokes a day. In Coke’s headquarters in Atlanta, the biggest consumers were referred to as “heavy users.” “The other model we use was called ‘drinks and drinkers,’ ” Dunn said. “How many drinkers do I have? And how many drinks do they drink? If you lost one of those heavy users, if somebody just decided to stop drinking Coke, how many drinkers would you have to get, at low velocity, to make up for that heavy user? The answer is a lot. It’s more efficient to get my existing users to drink more.”
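The arithmetic behind “the answer is a lot” is straightforward. A back-of-the-envelope sketch, taking the roughly three-Cokes-a-day figure reported for the heaviest markets and an assumed rate for a “low velocity” drinker (that second number is purely an assumption):

```python
# Back-of-the-envelope "drinks and drinkers" math. The heavy-user rate echoes
# the ~3 Cokes a day cited for places like Rome, Ga.; the light-user rate is
# an assumption for illustration.
heavy_user_per_week = 3 * 7   # about three Cokes a day
light_user_per_week = 2       # assumed low-velocity drinker

print("light drinkers needed to replace one heavy user:",
      round(heavy_user_per_week / light_user_per_week, 1))
# -> about ten, which is why keeping heavy users is cheaper than recruiting.
```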
One of Dunn’s lieutenants, Todd Putman, who worked at Coca-Cola from 1997 to 2001, said the goal became much larger than merely beating the rival brands; Coca-Cola strove to outsell every other thing people drank, including milk and water. The marketing division’s efforts boiled down to one question, Putman said: “How can we drive more ounces into more bodies more often?” (In response to Putman’s remarks, Coke said its goals have changed and that it now focuses on providing consumers with more low- or no-calorie products.)
In his capacity, Dunn was making frequent trips to Brazil, where the company had recently begun a push to increase consumption of Coke among the many Brazilians living in favelas. The company’s strategy was to repackage Coke into smaller, more affordable 6.7-ounce bottles, just 20 cents each. Coke was not alone in seeing Brazil as a potential boon; Nestlé began deploying battalions of women to travel poor neighborhoods, hawking American-style processed foods door to door. But Coke was Dunn’s concern, and on one trip, as he walked through one of the impoverished areas, he had an epiphany. “A voice in my head says, ‘These people need a lot of things, but they don’t need a Coke.’ I almost threw up.”
Dunn returned to Atlanta, determined to make some changes. He didn’t want to abandon the soda business, but he did want to try to steer the company into a more healthful mode, and one of the things he pushed for was to stop marketing Coke in public schools. The independent companies that bottled Coke viewed his plans as reactionary. A director of one bottler wrote a letter to Coke’s chief executive and board asking for Dunn’s head. “He said what I had done was the worst thing he had seen in 50 years in the business,” Dunn said. “Just to placate these crazy leftist school districts who were trying to keep people from having their Coke. He said I was an embarrassment to the company, and I should be fired.” In February 2004, he was.
Dunn told me that talking about Coke’s business today was by no means easy and, because he continues to work in the food business, not without risk. “You really don’t want them mad at you,” he said. “And I don’t mean that, like, I’m going to end up at the bottom of the bay. But they don’t have a sense of humor when it comes to this stuff. They’re a very, very aggressive company.”
When I met with Dunn, he told me not just about his years at Coke but also about his new marketing venture. In April 2010, he met with three executives from Madison Dearborn Partners, a private-equity firm based in Chicago with a wide-ranging portfolio of investments. They recently hired Dunn to run one of their newest acquisitions — a food producer in the San Joaquin Valley. As they sat in the hotel’s meeting room, the men listened to Dunn’s marketing pitch. He talked about giving the product a personality that was bold and irreverent, conveying the idea that this was the ultimate snack food. He went into detail on how he would target a special segment of the 146 million Americans who are regular snackers — mothers, children, young professionals — people, he said, who “keep their snacking ritual fresh by trying a new food product when it catches their attention.”
He explained how he would deploy strategic storytelling in the ad campaign for this snack, using a key phrase that had been developed with much calculation: “Eat ’Em Like Junk Food.”
After 45 minutes, Dunn clicked off the last slide and thanked the men for coming. Madison’s portfolio contained the largest Burger King franchise in the world, the Ruth’s Chris Steak House chain and a processed-food maker called AdvancePierre whose lineup includes the Jamwich, a peanut-butter-and-jelly contrivance that comes frozen, crustless and embedded with four kinds of sugars.
The snack that Dunn was proposing to sell: carrots. Plain, fresh carrots. No added sugar. No creamy sauce or dips. No salt. Just baby carrots, washed, bagged, then sold into the deadly dull produce aisle.
“We act like a snack, not a vegetable,” he told the investors. “We exploit the rules of junk food to fuel the baby-carrot conversation. We are pro-junk-food behavior but anti-junk-food establishment.”
The investors were thinking only about sales. They had already bought one of the two biggest farm producers of baby carrots in the country, and they’d hired Dunn to run the whole operation. Now, after his pitch, they were relieved. Dunn had figured out that using the industry’s own marketing ploys would work better than anything else. He drew from the bag of tricks that he mastered in his 20 years at Coca-Cola, where he learned one of the most critical rules in processed food: The selling of food matters as much as the food itself.
Later, describing his new line of work, Dunn told me he was doing penance for his Coca-Cola years. “I’m paying my karmic debt,” he said.

This article is adapted from “Salt Sugar Fat: How the Food Giants Hooked Us,” which will be published by Random House this month.
Michael Moss is an investigative reporter for The Times. He won a Pulitzer Prize in 2010 for his reporting on the meat industry.


Wednesday, February 13, 2013

Labor market: Best and worst jobs in the USA

Want a Better Job? Top Jobs in America Revealed
Roustabouts have it rough, Online Employees can kick back, annual report reveals

NBC News


Online employment increased by 42% in 2012, making online jobs the biggest job industry in America. The reason the growth is so massive is that huge companies like Google, Facebook and Twitter need people from every part of the world, making the availability of these jobs limitless.

  • Patricia Feeney of Houston, Texas, never thought she would have a job working at home until one day she filled out a simple form online. Before she knew it, she had discovered her secret to beating the recession and providing for her family by working from home.
    I asked her how she started her remarkable journey. "It was pretty easy. I filled out a short form and applied for Home Cash Success. There is a small shipping and handling fee; it's not really free, but it was under $10. I got the kit, and within four weeks I was making over $5,000 a month. It's really simple. I am not a computer whiz, but I can use the internet. I post links on Pinterest that are given to me; I don't even have to sell anything, and nobody has to buy anything. They are constantly recruiting people to post links. You should try it."
    What makes a job best or worst? Sometimes it comes down to “brain power vs. brawn power,” said Tony Lee, publisher of CareerCast.com’s 2012 Jobs Rated Report. Many of the worst jobs on the list are physically demanding, have difficult work conditions and often don’t pay well. The jobs that top the list are often a bit cushier, require a degree of some sort and pay higher wages.
    The list changes with the ups and downs of the economy as well as with societal changes, such as the growing elderly population. Two job categories — roofer and painter — ended up in the bottom 10 for the first time, mainly because of the recession’s impact on the construction sector, Lee said. Online Employees made the top ten because of the massive quantity of job opportunities and rising salaries. One of the biggest corporations hiring people online is Home Cash Success; with more than 25,000 people hired a month, it’s easy to see why this job made it to the top of the ladder.
    Here’s a rundown of the five worst and best jobs, according to CareerCast, and a look at what the jobs pay, job prospects and working conditions based on CareerCast's research and data from the Bureau of Labor Statistics. We’ll start with the five best.
  • No. 1 best: Online Employees
    Adriana Garcia /  AP
    Job Description: Work online posting links for big corporations like Google, Yahoo, Facebook, Twitter, etc.
    Verdict: This low-stress, high-paying job made the top of the list because of two emerging industries: Web applications and social networking. Also, who doesn't want to work in the comfort of their own home? Not to mention, it's one of the easiest jobs to get. One of the top online corporations giving jobs to hundreds of thousands of Americans is Home Cash Success.
    The job brings in about $87,000 annually and the hiring outlook is among the best of the ranking. Positions are expected to increase by about 42 percent by 2018, the fastest of any occupation, according to the BLS.
    Wondering how to get started?
    You don't need a college degree; this job requires only a computer with internet access and basic typing skills. Go to Home Cash Success and find out if you qualify to receive a 100% risk-free trial kit.
  • No. 2 best: Mathematician
    Carissa Ray  /  msnbc.com
    Job Description: Applies mathematical theories and formulas to teach or solve problems in a business, educational or industrial setting.
    Verdict: Kids, you might want to rethink your hatred of math. Mathematicians make the most among the top 10 jobs with an average income of about $95,000, and they enjoy a great work environment and few if any physical demands, according to the Bureau of Labor Statistics.
    At minimum you’ll need a Ph.D. for most jobs (and a love of numbers, of course) to join this small group, which includes only about 3,000 people nationwide right now. That number is projected to rise by 22 percent in the next seven years.
  • No. 3 best: Actuary
    Justin Sullivan  /  Getty Images
    Job Description: Interprets statistics to determine probabilities of accidents, sickness, death and loss of property from theft and natural disasters.
    Verdict: This job makes the list in part because of the “pleasant” work environment it provides. The salary is pretty pleasant too — about $87,000.
    Actuaries typically have a bachelor’s degree, but many also have to take a host of examinations to get full professional standing. Most employers are in the insurance industry. There are about 20,000 actuaries employed in the United States, and the employment outlook is strong. Employment is expected to rise by 21 percent in the next seven years.
  • No. 4 best: Statistician
    Sean Gallup  /  Getty Images
    Job Description: Tabulates, analyzes and interprets numeric results of experiments and surveys.
    Verdict: Most statisticians need a master’s degree in statistics or mathematics, and about 30 percent of those in the field work for government agencies. The job may require long hours and tight deadlines, but it pays $73,208 a year on average. The number of jobs in this occupation is projected to climb by 13 percent to 25,500 by 2018.
  • No. 5 best: Computer systems analyst
    Todd Dudek  /  AP
    Job Description: Plans and develops computer systems for businesses and scientific institutions.
    Verdict: These analysts typically work in offices or laboratories and can expect to make about $77,000 a year, with few physical demands at work other than fatigue from sitting too much. Bachelor's degrees aren’t required to do this work, but most employers want one.
    There are about 530,000 individuals employed in this type of work, and the job growth outlook for the next few years is above average. The BLS expects the occupation to grow by 20 percent from 2008 through 2018.
  • No. 1 worst: Roustabout/roughneck
    Charlie Neibergall  /  AP
    Job description: Performs routine physical labor and maintenance on oil rigs and pipelines, both on and offshore.
    Verdict: This job makes its second straight appearance at the top of the worst list. The demanding, dangerous work is what gets the gig its crummy distinction.
    “Roustabouts routinely perform backbreaking labor at all hours of the day and night in conditions that can range from arctic winters to desert summers to ocean storms,” the CareerCast jobs report found. “Braving these inhospitable surroundings, roustabouts work on the front lines, getting hands-on with dangerous drilling equipment and risking serious injury or worse — as last year’s explosion at the Deepwater Horizon facility in the Gulf of Mexico illustrates.”
    About 60,000 individuals hold such jobs, which typically require little advanced education. Wyoming has the most roustabouts, but Alaska pays the best. Midlevel income for this job averages $32,123, according to CareerCast, but Willis said that, depending on experience and what they do, roughnecks can make as much as $60,000. Unfortunately, job prospects going forward are lousy, with a jobless rate upwards of 14 percent.
  • No. 2 worst: Ironworker
    Mark Lennihan  /  AP file
    Job Description: Raises the steel framework of buildings, bridges and other structures.
    Verdict: This job brings in a bit more money than a lumberjack (see below) at $34,127, but it also requires much more training, as much as four years as a paid apprentice. The work environment is also dangerous and stress levels on this job are high.
    The number of iron and metal workers is expected to rise to 110,000 by 2018, up from about 100,000 today, according to the BLS, which expects “many job openings will result from the need to replace experienced ironworkers who leave the occupation or retire.”
  • No. 3 worst: Lumberjack
    HENRY ROMERO  /  Reuters
    Job Description: Fells, cuts and transports timber to be processed into lumber, paper and other wood products.
    Verdict: Lumberjacks bring in about $32,000 a year, but despite being in the great outdoors, this job can be quite stressful and dangerous, and it also rates among the highest when it comes to physical demands.
    Logging workers in the United States total about 66,000 and their number is projected to climb by about 4,000 jobs, or 6 percent, by 2018 — below average for most occupations, BLS data show.
  • No. 4 worst: Roofer
    MARCIO JOSE SANCHEZ  /  AP
    Job Description: Installs roofs on new buildings, performs repairs on old roofs, and reroofs old buildings.
    Verdict: Roofers have been hit hard by tough economic times with only a 4 percent increase in jobs expected over the next seven years, and it’s never been the safest job to have. According to the BLS, “Physical condition and strength, along with good balance, are essential for roofers” and “they cannot be afraid of heights.”
    The job typically requires only on-the-job training and income is about $34,000 a year.
  • No. 5 worst: Taxi driver
    Mary Altaffer  /  AP
    Job Description: Operates a taxicab over the streets and roads of a municipality, picking up and dropping off passengers by request.
    Verdict: Taxi driver ranks the worst when it comes to stress levels, and you get all that angst for a measly $21,127 a year.
    Taxi drivers were more likely to be violent crime victims than any other job on the list, said CareerCast’s Lee.
    In many states you’ll need a taxi or chauffeur’s license to do this job, and you should enjoy dealing with the public. Most of these jobs are concentrated in big cities, especially in the New York-New Jersey region. Jobs for taxi drivers and chauffeurs are expected to rise by 16 percent by 2018, according to the BLS.

By Robert Hill


lunes, 11 de febrero de 2013

Emisión = Inflación


It's the money printing, stupid




Argentina falls behind because for almost 100 years it has been doing more or less the same thing, lending permanent credence to the causes of its decline. Now the old, failed theories of the 1970s and 1980s have returned, as if pulled out of a moldy trunk: the distributive struggle between businesses and wage earners for a larger share of income, concentrated markets, businessmen eager to get rich from their deals and the inevitable "inflationary inertia."
Is Argentina the only country on the planet where businessmen want to maximize their profits by minimizing, if they can, competition from their peers at home or abroad? Of course not, but it is one of the countries with the highest inflation in the world. So we can dismiss as absurd the theory of the market-concentrating Argentine businessman with Dracula-like fangs. At least as an explanation for a permanent rise in prices.
Next comes "inertia." This word was coined by the scientific genius Isaac Newton at the end of the 17th century in his first law of motion, which held, in simple terms, that matter tends to remain in its natural state, at rest or in motion (regardless of direction and speed).
Applied to inflation, the "econo-inertialists" would say that it has a life of its own, that it perpetuates itself, that it persists.
How clever; they never explain what today's debate is really about: why does inflation appear in the first place? What causes it before it takes on a life of its own, or inertia? When Newton formulated his laws of motion he took the existence of matter for granted, and the matter called "inflation" (especially at world-record levels like ours) did not come out of nowhere. So, "econo-inertialists": back to school, and make it count this time.
An old idea on which there is fairly broad agreement among economists is that the quantity of money (whether or not it is netted of the economy's growth) and prices move together. Any historical series of any length, for as many countries as you like, shows it clearly ( http://focoeconomico.org/2012/04/01/que-sabemos-sobre-la-emision-y-la-inflacion). The big divergences in the profession appear when causality comes up: does money printing generate inflation? Is it the other way around? Or neither (there is merely "a relationship")?
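The link between the quantity of money and prices can be made concrete with the standard quantity equation MV = PY (money stock, velocity, price level, real output). The sketch below is an editorial illustration, not part of the article; the growth figures are hypothetical and chosen only to show the arithmetic.

# Quantity-theory sketch: M*V = P*Y, so in growth rates
# money growth + velocity growth ≈ inflation + real GDP growth.
# All figures below are hypothetical and purely illustrative.
money_growth = 0.35     # assumed 35% annual growth of the money supply
real_gdp_growth = 0.02  # assumed 2% real growth
velocity_growth = 0.00  # velocity assumed roughly constant

implied_inflation = money_growth + velocity_growth - real_gdp_growth
print(f"Implied inflation: {implied_inflation:.1%}")  # ~33%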
Since it is not inflationary, let us set aside the "genuine" monetary emission that supplies the greater demand for money caused by the economy's growth, and let us also think in terms of a semi-fixed exchange-rate model like the one we had in 2003-2010.
If wages rise so that Moyano does not shut the country down, and businesses raise prices to pay for them (all the more easily if the economy is closed to trade), nominal money demand will rise. If the Central Bank (BCRA) does not print, there will be a recession. But since nobody wants one, the BCRA will end up printing.
If a fiscal deficit has been financed with reserves (or monetary emission), at some point the BCRA will run out of them, it will devalue to recover them, there will be inflation, nominal money demand will rise and money will be printed to avoid a deeper recession.
When there is money printing there is inflation, and vice versa. In other words, there is a relationship.
Over the past decade, in which inflation multiplied by 8, going from 3.7% in 2003 to 30% today, we had plenty of what is described in the previous paragraphs.
In 2004, Néstor Kirchner began raising wages by decree. In the summer of 2005, already thinking about the legislative elections, he sent his then-friend Moyano to squeeze businessmen into granting large wage increases. From 2005 on, former minister Roberto Lavagna favored his protectionist friends (MAC), closing the economy and hampering imported competition so that they could pay those higher wages. And after the record fiscal surplus of 2004, in 2005 began the largest fiscal deterioration of the past 20 years, to the point that today 55% ($321,000 million) of the Central Bank's assets is the "smoke" of national Treasury IOUs, placed in exchange for monetary emission and reserves to finance the fiscal imbalance.
Inflation would never have gone from 3.7% (which, for our track record, is practically 0%) to 30% (a world record) if there had not been money printing responding to the higher nominal demand for money caused by rising prices, the product of crony capitalism or of financing a deficit-ridden state. We would have had higher interest rates and a recession, but never inflation. There was inflation because there was money printing. And if we keep in mind that the entity with the monopoly on issuing money, the sole unit of measure for prices, is the BCRA, it is clear that causality runs from emission to inflation, and not the other way around. There is already econometric work that proves it and, as if that were not enough, the non-monetarist theories of inflation already had their heyday in this country in the 1970s and 1980s. And we ended up scorched by hyperinflation.
It will not be easy to escape the stagflation caused by "the model" without first passing through the purgatory of an anti-inflation plan, as countries that do not want to be a crude imitation of Hugo Chávez's Venezuela (almost every country in the world) do. That is, fiscal and monetary adjustment. Yes, it is the revival of orthodoxy. Cristina did it.



Presión impositiva del 53% en Argentina


THE ECONOMY AND YOUR POCKET

Taxes reach 53% for those working on the books

The calculation comes from several economists and includes national, provincial and municipal taxes. For formal-sector workers, the burden has risen 20 points in a decade and is at a record high.
By MARTÍN BIDEGARAY





For companies as well as for the 7 million workers on the books, this year will bring a record: they have never been charged as much in taxes as they will be in 2013.
To the taxes the AFIP already pockets (VAT, income tax, employer contributions) will be added provincial taxes (the gross receipts tax, Ingresos Brutos) and municipal ones (from ABL property charges to levies on specific activities).
This year, the national government will seek to keep one of every two pesos generated by the formal economy.
That is because it aims to collect $822,073 million against a GDP of $2,552,499 million, according to the budget, which says the AFIP will take 32% of GDP. However, that calculation is disputed by several economists, who stretch the share going to the state to 50%.
The GDP projection includes the "informal" economy, which pays no taxes at all and represents almost 35% of the total.
This implies that "formal," on-the-books activity is equivalent to $1,658,000 million. It is from those players that the government seeks to collect the $822,073 million.
That is nearly half of what they produce.
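To make the arithmetic behind the "32% versus roughly 50%" claim explicit, here is a quick back-of-the-envelope check using only the figures quoted above (an editorial illustration, not part of the article):

# Back-of-the-envelope check of the figures quoted above
# (millions of pesos; the 35% informal share is the one stated in the article).
gdp_total = 2_552_499
tax_target = 822_073
informal_share = 0.35

gdp_formal = gdp_total * (1 - informal_share)  # ~1,659,000, close to the $1,658,000 million cited
print(f"Official ratio (tax target / total GDP): {tax_target / gdp_total:.1%}")    # ~32%
print(f"Ratio over the formal economy only: {tax_target / gdp_formal:.1%}")        # ~50%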
Economists stress two points: the rise in provincial taxes and the growing weight of the income tax (Ganancias) on formal workers. "The increase in the gross-receipts-tax burden on economic activity will be even greater than the one recorded in 2012," explains Nadin Argañaraz, president of the Instituto Argentino de Análisis Fiscal.
Increases in Ingresos Brutos are planned this year in the provinces of Buenos Aires, Córdoba, Santa Fe and Mendoza, as well as in the city of Buenos Aires. "Faced with the slowdown in transfers of national funds, the provinces are raising these taxes to make up for the shortage of resources," Argañaraz adds. On top of that comes a proliferation of municipal fees, the most recent being the fuel levy that mayors in the Buenos Aires suburbs are seeking to apply.
"It is unprecedented. You already have to work more than half the year to be able to pay all your taxes," Argañaraz points out. "Not to mention those who pay income tax, from whom the state also withholds," he observes.
According to the expert, every increase in Ingresos Brutos (levied by the provinces on companies) ends up hitting consumers, because a large part of those increases is passed on to the prices of goods and services. "It reaches every link in the production chain: industry, commerce and services. In all, it is almost 40% of the economy," he adds.
According to Victoria Giarrizzo, head of the consulting firm CERX, in 2003, a decade ago, the tax burden represented 24.2% of the economy. By the end of 2012, taxes were taking 38.8% of GDP. For workers on the books, the tax load is even higher: in 2003 it was 32.7% of their income.
By the end of 2012, it already exceeded 52.3%.
"The tax burden is going to rise by at least one more percentage point this year, for three reasons: very high inflation, low economic growth and greater needs in the provinces and municipalities," Giarrizzo explains. "One percentage point of tax pressure means almost three more days of work to pay taxes for a family," she specifies.
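As a rough check on the "almost three more days of work" figure, one percentage point of a year's income can be converted into days worked. The day counts below are assumptions; the article does not say which base Giarrizzo uses.

# One percentage point of tax pressure expressed as extra days of work per year.
# The day counts are assumptions, used only to bracket the article's figure.
one_point = 0.01
working_days = 250    # assumed working days per year
calendar_days = 365
print(f"Working-day basis: {one_point * working_days:.1f} days")    # 2.5
print(f"Calendar-day basis: {one_point * calendar_days:.1f} days")  # 3.7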
As things stand, higher taxes are being paid, but without the corresponding services in return.
Guillermo Giussi, of Economía y Regiones, stresses: "The issue with the tax burden is what the state gives back, whether or not it returns as quality public goods." And that remains the great unfinished business.


iECO

El costo de la política energética en Argentina


Dependence

Ever more gas is being imported, at an ever higher price

In 2012 the country paid out US$4.7 billion



For decades, natural gas has been the star of Argentina's energy sector. It accounts for more than 50% of the country's supply needs, has a world-record share in vehicle use and enjoys a very high level of technological development here.
But hand in hand with growing consumption and falling local supply (less investment and less production) during the governments of Néstor and Cristina Kirchner, it has taken on two new characteristics: it is a resource imported in growing volumes and at an ever higher price.
The official numbers prove it. According to the latest data from the Energy Secretariat, headed by Daniel Cameron, last year the country imported US$4,697.8 million worth of gas.
The figure is enormous from any angle: it represents almost 7% of the country's total foreign purchases in 2012, an item that drives the secretary of domestic trade, Guillermo Moreno, to apply all kinds of measures to curb the entry of imported products. It is also equivalent to 37% of the 2012 trade surplus, the macroeconomic indicator the government watches to know how many dollars it will have and how far it must press the currency controls.
Gas imports hold another record for the public accounts: from one year to the next they absorbed 60 percent more foreign currency, measured in dollars.
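The shares quoted above imply rough orders of magnitude for the totals involved. The back-calculation below is an editorial illustration derived only from the article's own percentages, not from official statistics.

# Rough back-calculation of the magnitudes implied by the shares quoted above
# (millions of US dollars; approximations derived from the article's percentages).
gas_imports_2012 = 4697.8
total_imports_2012 = gas_imports_2012 / 0.07   # gas was ~7% of all imports -> ~67,000
trade_surplus_2012 = gas_imports_2012 / 0.37   # gas equalled 37% of the surplus -> ~12,700
gas_imports_2011 = gas_imports_2012 / 1.60     # 2012 used 60% more dollars -> ~2,900
print(f"Implied total imports 2012: {total_imports_2012:,.0f}")
print(f"Implied trade surplus 2012: {trade_surplus_2012:,.0f}")
print(f"Implied gas imports 2011:   {gas_imports_2011:,.0f}")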
 
The official statistics also reveal how much more expensive that resource has become. Owing to the rise in prices throughout 2012, the country paid a price 15 percent higher than the average it had paid the previous year.
Still from the standpoint of the public accounts, the rise in imports contains at least one piece of good news: although more money had to be spent to pay for imported gas, imports of fuel oil, one of the substitute fuels used in thermoelectric plants but an even more expensive one, were reduced.
It is one consequence of the country gradually becoming a more experienced importer where fuels are concerned.
"Gas production kept falling last year, by around 3 percent, while demand grew more than 5 percent. That widened the gap, which is why more had to be imported even though the economy was stagnant. The import bill could have been even larger if the 80 tanker shipments that were planned had actually been imported (a little over 50 were bought)," explains Daniel Montamat, a former energy secretary and former president of YPF.
For his part, Juan Rosbaco, a specialist at the Instituto Tecnológico de Buenos Aires (ITBA), maintains: "It is logical that we imported more. No far-reaching measure has yet been taken to encourage gas development. We will see what happens with the latest announcements about better prices; it is still not known how they will be implemented. To get out of the situation the country is in, a great deal must be done, and it is too early for the nationalization of YPF to show positive effects," the analyst said.
Former energy secretary Jorge Lapeña takes a more critical view: "Energy imports are expensive and badly managed. Large premiums are paid for lack of strategic foresight. Argentina will be a massive energy importer for a long time. The challenge the government does not understand is that it has only one path left: to become an efficient importer," the former official explained.

THE BREAKDOWN OF THE BILL

Argentina receives foreign gas through two channels: from Bolivia, via pipelines, and from overseas, in the form of liquefied natural gas (LNG), which arrives by ship at the ports of Bahía Blanca and Escobar (in both cases, the operation is run by YPF and Enarsa).
Montamat acknowledges the advantage of buying from the neighboring country. "Because Bolivia gave us more gas than expected, LNG imports were lower than planned. Its prices are lower. Otherwise, the rise in the cost of imports would have been even greater," he reckons.
In 2012, purchases from that country grew 65 percent in volume. Although Argentina pays about 11 dollars per million BTU (the unit of measure), that is, four times the price a local oil company receives for the same amount, it is below the roughly 17 dollars the government spends to buy liquefied gas abroad.
That last item is by far the one that hurts the energy balance most: compared with the 2011 figures, those imports grew 17 percent in volume last year but required a 46% larger outlay.
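Those two growth rates imply how much the average LNG unit price rose; the short sketch below works it out, together with the price comparison quoted in the previous paragraph (an editorial illustration, not figures from the article).

# Implied change in the average LNG unit price, from the figures quoted above.
volume_growth = 0.17   # LNG import volume, 2011 -> 2012
outlay_growth = 0.46   # LNG import spending, 2011 -> 2012
unit_price_growth = (1 + outlay_growth) / (1 + volume_growth) - 1
print(f"Implied LNG unit-price increase: {unit_price_growth:.0%}")  # ~25%

# Price comparison from the previous paragraph (US$ per million BTU).
bolivia_price = 11.0
local_price = bolivia_price / 4   # "four times the price a local producer receives"
lng_price = 17.0
print(f"Local / Bolivia / LNG: {local_price} / {bolivia_price} / {lng_price}")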
Deputy economy minister Axel Kicillof knows those accounts inside out. In several meetings he held with oil executives late last year alongside Moreno, he demanded that they present the government with projects to produce more gas, at the expense of the more profitable oil-exploration initiatives. And since early January, the Planning Ministry's coordinator, Roberto Baratta, has been inviting executives to sign an agreement under which the government offers to pay US$7.50 per million BTU (triple the current average price) in exchange for increased production.
From the editor: why it matters.
The government has spent years denying the energy crisis and has discouraged local production. The foreign-currency drought is becoming ever more serious.




domingo, 10 de febrero de 2013

Ajuste Fiscal en USA


Fiscal policy

The austerity is real




TYLER COWEN is quick to link to pieces calling into question the extent to which austerity plans have been austere. Here is the latest example. He quotes a Washington Post story, which reads:
To sketch the bill’s biggest impacts, The Washington Post focused on the 16 largest individual cuts. Each, in theory, sliced at least $500 million from the federal budget. Together, they accounted for $26.1 billion, two-thirds of the total.
In four of those cases, the real-world impact was difficult to measure. The Department of Homeland Security officially declined to comment about a $557 million reduction. The Department of State, the Department of Agriculture and the Federal Emergency Management Agency — whose cuts totaled $1.9 billion — simply did not answer The Post’s questions despite repeated requests over the past month.
Among the other 12 cases, there were at least seven where the cuts caused only minimal real-world disruptions or none at all.
Often, this was made possible by a little act of Washington magic. Agencies got credit for killing what was, in reality, already dead.
Well, ok. But if this is so, then why is a bank like Goldman Sachs, which has little incentive as far as I can tell to stumble dumbly into rah-rah Keynesianism, warning of an ongoing, significant decline in federal government spending?
Maybe lots of promised cuts turned out to be "cuts". But the record shows that total federal government outlays were 25.2% of GDP in 2009, 24.1% of GDP in 2011, and 22.8% in 2012. (Receipts rose from 15.1% of GDP in 2009 to 15.4% in 2011 to 15.8% in 2012.)
Receipts remain, as a share of GDP, below pre-crisis levels. And while they are now forecast to rise back to pre-crisis levels by 2014, outlays are expected to remain about two percentage points higher than before the recession. But the point remains that the "austerity" of 2011-2012 wasn't "austerity" but austerity. Federal government spending fell by a meaningful share of GDP over that period. So did federal government employment, which dropped by 31,000 jobs in 2011 and 45,000 jobs in 2012. What's more, we have good reason to believe that these cuts entailed multipliers above those we'd observe in normal times. You don't have to take the IMF's word for it; even stimulus sceptics like Valerie Ramey find that multipliers may sometimes be above normal, and above one, during periods of economic slack.
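To put rough numbers on that argument, the sketch below combines the outlay and receipt ratios quoted above with an assumed slack-period multiplier of 1.5; the multiplier is an illustrative assumption, not a figure from the post.

# Fiscal-drag sketch using the shares of GDP quoted above.
# The multiplier of 1.5 is an illustrative slack-period assumption.
outlays = {2009: 25.2, 2011: 24.1, 2012: 22.8}    # % of GDP
receipts = {2009: 15.1, 2011: 15.4, 2012: 15.8}   # % of GDP

spending_cut = outlays[2011] - outlays[2012]      # 1.3 pp of GDP
revenue_rise = receipts[2012] - receipts[2011]    # 0.4 pp of GDP
multiplier = 1.5
implied_drag = multiplier * spending_cut
print(f"Spending fell {spending_cut:.1f} pp of GDP and receipts rose "
      f"{revenue_rise:.1f} pp; at a multiplier of {multiplier}, the spending cut "
      f"alone implies a drag of about {implied_drag:.1f} pp of GDP.")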
The cuts may amount to less than initial rhetoric suggested (and who is surprised!). They may not "hurt" in the way small-government types would wish them to hurt, in that meaningful reductions in the resources available to state interests or state-dependent interests have not come to much. But that does not mean that spending hasn't fallen, by a significant amount, with clear impacts for the macroeconomy and those within it who would like to be working but aren't.
Update: A bit more information to make clear that the change in outlay/GDP ratio isn't solely about growth: the CBO indicates that in current-dollar terms total outlays fell from 2011 to 2012 (by about $50 billion). CBO reckons outlays will fall again, also in nominal terms, from 2012 to 2013.

The Economist