Recent Updates

  • feedwordpress 06:17:53 on 2018/11/11 Permalink  

    A Murmuration of Starlings 

    That's all.

     
  • feedwordpress 03:56:02 on 2018/10/04 Permalink  

    We’re (still) not being alarmist enough about climate change 

    What if we had another 9/11, and nothing happened?

    Living in New York City, the one fantasy sport that everybody plays is real estate; we all like to imagine what it would be like to be able to afford to buy a place. And sometime over the last year or two, I realized that, even if I won the lottery and could afford to buy a home in my preferred neighborhood (the Lower East Side), I probably wouldn’t get one in most of the places I'd want to live. Because I think over the 30-year term of that mortgage, our neighborhood will be significantly destabilized by climate change.

    It was a bit of a shock for me to come to this realization, because the logic is extremely straightforward, but I hadn’t really considered the implications at that visceral level. I hadn't yet let the science change my daydreaming about an HGTV future. And as I’ve talked about that reality to more and more people in my circle of friends, a curious pattern has emerged. They’ve all found the rationale around the impacts of climate change unimpeachable, but nearly all are very reluctant to embrace the conclusion that this logic inevitably yields.


    In our neighborhood, it’s easy to see the impacts of increasingly powerful storms. We were hit hard by Sandy, with power outages for days or weeks, and severe disruptions for months. I still see buildings with the high-water mark outlined on them, and I still remember which places stayed open to serve people in those incredibly dark nights when we didn't even have street lights to show the way. Any day now, they’ll be shutting down the most essential subway line in our neighborhood for massive tunnel repairs that are expected to take years to complete. This is all still recovery from a storm that most of the country has already forgotten about, one that popular memory recalls as "not as bad as they thought it was going to be".

    But if you talk to transit advocates or city council members or the state officials responsible for funding such repairs (and I do), many of them predicate their argument for repairing our subway tunnel on the idea that we’ll be “fixing the problem”. We've got to fix a subway tunnel, right? It’s okay to amortize the cost of repairs to this tunnel over years or decades because then we’ll be in good shape, right?

    I don’t think so.

    We’re still acting like today, and the recent past, is an aberration, and the future will involve things "returning to normal". We're all making assumptions about when New York City will be hit again by another Sandy-scale storm. Most infrastructure investments are still being made with a “storm of the century” mindset, which assumes it’ll be decades until we confront this kind of disruption again. But it's far more likely, given that the rate of climate change is accelerating, that we’ll see such disruption again within a few years. The big storms we confront in the coming decades won’t always be on the scale of Sandy, but they will hit with far more energy, impact, and frequency.

    And when they do, not only will we not be ready, we’ll be nowhere near prepared for the rebuilding and reinvestment it will take to recover. We'll have spent our time and resources on investments that treat extreme climate disruption as the exception, instead of the norm.

    —-

    We're Bad At This

    Much has been written in recent years about how human societies are bad at catching on to creeping threats, as opposed to acute dangers. Western societies in particular seem vulnerable to this, and America at this particular moment seems oriented toward willfully ignoring any long-term trends or obvious threats, in favor of conjuring up imagined dangers. There’s a long strain of anti-intellectualism and short-term profiteering that has led to this point, but the years of effort in undermining science and introducing doubt into the existence of scientific consensus have produced an awful, if inevitable, outcome. Many of our political leaders in power seem shockingly comfortable with encouraging a death cult amongst their followers; this began with normalizing violence but easily evolves into an environment where existential threats are treated as exciting opportunities for a rapturous reckoning, rather than a threat to everyone.

    In the past, we at least were able to treat galvanizing moments of obvious threats as a catalyst for change. For example, we reacted in the extreme to the shock and tragedy of 9/11. Unfortunately, our thoughtless reaction has delivered Bin Laden an almost total victory, as we embraced nearly every costly, self-defeating tactic possible. But even in losing the war on terror we certainly demonstrated that we were able to use the death of thousands as a motivation to make huge, costly, sweeping changes in society. It’s even possible to imagine what might have happened if we’d responded to the shock of 9/11 with an urgent effort to make positive changes instead of destructive ones.


    This time, though, we had a catastrophe with a far higher death toll, and a far higher economic toll, than 9/11, and the regime in power decided to act as if nothing had happened. Puerto Rico's awful fate under Maria was rendered even more horrific by a political response that began with indifference and then degenerated into overt denial. We can almost imagine Trump staring at the smoldering piles of rubble where the Twin Towers had stood, and not merely crowing about how his buildings had moved up in the list of tallest skyscrapers, but actively denying that anyone had died in the World Trade Center at all. Now imagine the rest of us, knowing there was going to be another 9/11 every few years, imagining that it wasn't going to be us who got targeted next.

    Beyond Despair

    I know it doesn't sound like it, but I'm an optimist. The reason I love technology and popular culture so much is because I never stop being inspired by what humans create. But I try to be pretty good at seeing where society is heading, and judging where our tastes and trends will take us over time. Usually, that just requires looking at patterns of the past, and learning from that history. This time, I don't think that works.

    There isn't going to be a last-minute reprieve on climate that lets us keep living in the world we used to have.

    Today's political environment demands that scientists still talk about the steps it would take to limit the global temperature increase to 1.5°C. That is not going to happen. I don't even think we're going to limit the increase to 2°C within my lifetime. I believe the millions of climate-chased refugees around the world today will be joined by tens of millions more tomorrow. I believe the increasing frequency of sectarian or regional violence instigated by climate-driven disruption of access to water or food will result in more large-scale conflicts. I think governments, even in wealthier or recently stable regions, will be destabilized by the stresses climate disruption places on infrastructure for food, water, transportation, immigration and trade.

    But I do also think some large-scale changes in behavior will happen faster than we've ever seen in human history. Solar power will gain efficiency and drop in cost at a rate that mimics the progress in smartphones over the last decade. While it'll still be an expensive and resource-intensive effort to create all these solar cells, they'll be able to beat fossil fuels in every regard — including cost — much sooner than people expect, and with far greater impact than we might predict. I'm not quite as bullish on the path for invention and innovation around removing carbon from the environment, but I wouldn't entirely bet against it, either.

    The undermining of the United States' political credibility in the world, and the weakening of its cultural domination over the world, will also yield some benefits in mitigating climate disruption as fewer cultures seek American-style consumption as part of their lifestyles. Not craving giant cars and meat-filled meals will be good for the world, and we're already seeing that shift happen within the United States as well.

    All this could add up to enough to have a huge positive impact in just a few decades. That will, sadly, not be enough to save the millions of lives that we'll see lost to climate disruption in the next half-century. But it's possible that millions of people may still be living in Manhattan in 50 years. I'd put the odds at a little less than 50/50.


    Higher Ground

    I don't know how this plays out. Not a day goes by that I don't grieve for the horrible tragedies my son is going to have to watch unfold during his lifetime because of our collective shortsightedness and failure to act. The reckoning now is whether what's left after all that chaos still resembles the society we have today (yes, even with all its grave and awful injustices) or if the jolt of these changes is too extreme. It's possible that things become so unpredictable and contentious due to climate change that we never find a new political or cultural stability during his lifetime.

    It's hard to believe these things and still have hope, even knowing that our privilege and access and good fortune and talents isolate us from the worst that will come. As a New Yorker who lived here at that time, I still use 9/11 as a reference because it really did change my whole life and my whole perspective. But as the climate evolves, there's a 9/11 every week.

    This year, it's wildfires and hurricanes and typhoons and floods, and every single one is a record breaker — until next year. I don't know how to say it to make people understand: this isn't about "this year". This is the rest of our lives. I don't even think it makes sense to talk about preventing climate disruption now. The question is how we move on to preparing for it, building resiliency into all the institutions and infrastructures that will need to evolve, and caring for those who are most vulnerable as we keep moving down the path we've chosen.

    Honestly, that thought doesn't depress me (though I understand why for so many, it will). It's simply the work in front of us, the task we have to do. I don't feel hopeless because there's no point to feeling hopeless. We simply have to build a world that keeps working while the one that we have today starts to disappear.

     
  • feedwordpress 18:13:03 on 2018/09/13 Permalink  

    The price of relevance is fluency 

    The price of relevance is fluency

    “You can’t say anything anymore! You can’t even make jokes!”

    There’s a constant complaint from people in positions of power, mostly men, who keep making the ridiculous assertion that they’re not able to speak in public. What they actually mean is they no longer understand the basis of the criticisms they face. And it’s a phenomenon we see from so many people who have a public platform, whether they’re CEOs or comedians or other cultural figures.

    Some of this is a familiar issue: the powerful think that ordinary people have no right to criticize them. There’s nothing new there, and certainly a lot of the dismissive reactions are simply these people thinking that they’re better than their critics, and so don’t have to listen to the pushback. But even those who think they should still be at least pretending to take feedback from the public are mystified by what they’re hearing.

    But there is something new that's also helping cause all this fuss: the rate of change in culture is increasing.


    For some kinds of people, we valorize the breaking of social conventions. In business, it’s called “disruption”, in arts or culture they’ll be called “bad boys” or other similarly ridiculous names for rewarding transgression. Eventually, these rule breakers (who, of course, seldom break the rules of systemic racism or sexism or other structural injustices) find themselves in a position where they have a public voice. They’re onstage, or quoted in the media, and they love the fact that they’re being heard. They bask in the unalloyed adulation of the masses.

    Until recently. All of a sudden, the same things they’ve always said, or something said in private that suddenly becomes public, get a vociferous negative response unlike anything they've ever encountered. Usually, that blowback happens on social media, and these powerful legacy leaders tend to blame the issue on some ineffable negative essence of social networks. They rant about things like "the twitter mob". But that's not the issue at all.

    There Is No "Twitter Mob"

    You see, there is no "Twitter mob", there's only people. And people shape culture, and culture evolves. But in the past, the powerful could keep themselves isolated from the way culture evolves, if they wanted to. Janet Jackson didn't even know what Hot Cheetos were!

    And so, these political leaders and CEOs and comedians and famous-for-being-famous people blather on like they always have, only now they're faced with the criticisms they've inspired. The criticisms were always there, but the connection of social media to mass media has made them visible.

    Worse, that visibility of critique means that powerful people now have to do work that they didn't want to do. They can't stand it.

    Suddenly, even the most powerful people in society are forced to be fluent in the concerns of those with little power, if they want to hold on to the cultural relevance that thrust them into power in the first place. Being a comedian means having to say things that an audience finds funny; if an audience doesn't find old, hackneyed, abusive jokes funny anymore, then that comedian has to do more work. And what we find is, the comedians with the most privilege resent having to keep working for a living. Wasn't it good enough that they wrote that joke that some people found somewhat funny, some years ago? Why should they have to learn about current culture just to get paid to do comedy?

    Similarly, CEOs keep fussing about how it's hard to not offend people these days. (Being a CEO myself, this one ends up on my radar a lot.) Now, every person in marketing knows they have to try to stay culturally relevant, and certainly every ordinary worker knows they constantly have to be learning new skills and developing professionally. But if a CEO has been in his seat long enough, he'll often get deeply resentful of being told that he has to learn new ways of being respectful to the people who were systematically obstructed from reaching his awareness in the past.

    We can't even count all the stupid ways this plays out, but there are common tropes. The go-to examples of resistance to cultural evolution are always the legacy power-holders resisting the very identity of the communities they excluded. You'll hear awful shit like, "I don't know whether to call them Black or African American, or what?" or terrible "jokes" about how people choose the pronouns that they prefer to be identified with. Now, these powerful folks don't want to be held accountable for disrespecting people with different identities, and the powerful certainly don't want to be mocked for their illiteracy in contemporary culture, but they damn sure want to make certain that you know they're not interested in indulging modern norms for showing respect to others.

    It's not that hard

    Here's the thing, though: It's not that hard. It's not difficult at all to ask people how they want to be identified. It's not tricky to listen to what people are saying about their concerns and their issues, and to try to understand what that means about how culture is evolving. It's not hard at all to be humble about unfamiliar aspects of society and ask for information in respectful ways, then take those responses into consideration going forward.

    And in fact, that's the simple price of continued cultural relevance. If someone wants to maintain power in culture, all that's required is a sincere and honest engagement with those who are granting that power through their attention and support. All it takes is a little bit of curiosity and some basic human decency, and any of us who are blessed with the good fortune to have a platform will get to keep it, and hopefully to use it to make things a little better for others.

    But those in power who have a loud public voice and refuse to adjust and evolve their messages for the modern world will only face increasing resistance, and even actual accountability sometimes — perhaps even in the form of losing their platforms. And good riddance.

     
  • feedwordpress 19:29:22 on 2018/09/11 Permalink  

    Seventeen is (Almost) Just Another Day 

    For the first decade after the attacks, I basically didn’t go anywhere near that part of downtown. A business meeting would take me a few blocks away, and I’d feel that tightness in my chest, that presence, and I’d just keep moving.

    But this morning, I’ll walk out of the (newly opened!) A train access at the Oculus as part of my commute, as I’ve done dozens of times before. Somehow, improbably, it’s all become routine.

    To be clear, I do still deliberately steer away from the reflecting pools and the memorials. And I give an even wider berth to the tourists drawn to them. (“Which way is 9/11?”) But somehow this place is something more than its ghosts now, and I’m able to be a transit nerd who appreciates a beautiful, if absurd, train station, or to be a person who appreciates a farmer’s market, even as it sprawls just steps away from where I remember seeing that still-smoldering pile of rubble.

    —-

    People don’t even really ask, “Where were you that day?” anymore. For all of the ironic “Never Forget”s, the whole moment has largely faded into history, even as the whole world really was reshaped. There’s a mall there now, a temple to the “go back to shopping” doctrine introduced in those first days of chaos and grief. In the current moment, it’s clearer than ever that those murderous attackers succeeded far beyond their wildest dreams, achieving forms of destruction and destabilization that they probably never even dreamed of.

    But there’s something ordinary, too. If this has become the New Normal, the most unlikely part may be that it’s still a kind of normal.

    —-
    I spent so many years thinking “I can’t go there” that it caught me completely off guard to realize that going there is now routine. Maybe the most charitable way to look at it is resiliency, or that I’m seeing things through the eyes of my child who’s never known any reality but the present one.

    In Past Years

    Each year I write about the attacks on this anniversary, to reflect both on that day and where I'm at right now. I also deeply appreciate the conversation that ensues with those who check in with me every year.

    Last year, Sixteen is Letting Go Again:

    So, like ten years ago, I’m letting go. Trying not to project my feelings onto this anniversary, just quietly remembering that morning and how it felt. My son asked me a couple of months ago, “I heard there was another World Trade Center before this one?” and I had to find a version of the story that I could share with him. In this telling, losing those towers was unimaginably sad and showed that there are incredibly hurtful people in the world, but there are still so many good people, and they can make wonderful things together.

    Two years ago, Fifteen is the Past:

    I don’t dismiss or deny that so much has gone so wrong in the response and the reaction that our culture has had since the attacks, but I will not forget or diminish the pure openheartedness I witnessed that day. And I will not let the cynicism or paranoia of others draw me in to join them.
    What I’ve realized, simply, is that 9/11 is in the past now.

    Three years ago, Fourteen is Remembering:

    For the first time, I clearly felt like I had put the attacks firmly in the past. They have loosened their grip on me. I don’t avoid going downtown, or take circuitous routes to avoid seeing where the towers once stood. I can even imagine deliberately visiting the area to see the new train station.

    In 2014, Thirteen is Understanding:

    There’s no part of that day that one should ever have to explain to a child, but I realized for the first time this year that, when the time comes, I’ll be ready. Enough time has passed that I could recite the facts, without simply dissolving into a puddle of my own unresolved questions. I look back at past years, at my own observances of this anniversary, and see how I veered from crushingly sad to fiercely angry to tentatively optimistic, and in each of those moments I was living in one part of what I felt. Maybe I’m ready to see this thing in a bigger picture, or at least from a perspective outside of just myself.

    From 2013, Twelve is Trying:

    I thought in 2001 that some beautiful things could come out of that worst of days, and sure enough, that optimism has often been rewarded. There are boundless examples of kindness and generosity in the worst of circumstances that justify the hope I had for people’s basic decency back then, even if initially my hope was based only on faith and not fact.
    But there is also fatigue. The inevitable fading of outrage and emotional devastation into an overworked rhetorical reference point leaves me exhausted. The decay of a brief, profound moment of unity and reflection into a cheap device to be used to prop up arguments about the ordinary, the everyday and the mundane makes me weary. I’m tired from the effort to protect the fragile memory of something horrific and hopeful that taught me about people at their very best and at their very, very worst.

    In 2012, Eleven is What We Make:

    These are the gifts our children, or all children, give us every day in a million different ways. But they’re also the gifts we give ourselves when we make something meaningful and beautiful. The new World Trade Center buildings are beautiful, in a way that the old ones never were, and in a way that’ll make our fretting over their exorbitant cost seem short-sighted in the decades to come. More importantly, they exist. We made them, together. We raised them in the past eleven years just as surely as we’ve raised our children, with squabbles and mistakes and false starts and slow, inexorable progress toward something beautiful.

    In 2011 for the 10th anniversary, Ten is Love and Everything After:

    I don’t have any profound insights or political commentary to offer that others haven’t already articulated first and better. All that I have is my experience of knowing what it meant to be in New York City then. And from that experience, the biggest lesson I have taken is that I have the obligation to be a kinder man, a more thoughtful man, and someone who lives with as much passion and sincerity as possible. Those are the lessons that I’ll tell my son some day in the distant future, and they’re the ones I want to remember now.

    In 2010, Nine is New New York:

    [T]his is, in many ways, a golden era in the entire history of New York City.
    Over the four hundred years it’s taken for this city to evolve into its current form, there’s never been a better time to walk down the street. Crime is low, without us having sacrificed our personality or passion to get there. We’ve invested in making our sidewalks more walkable, our streets more accommodating of the bikes and buses and taxis that convey us around our town. There’s never been a more vibrant scene in the arts, music or fashion here. And in less than half a decade, the public park where I got married went from a place where I often felt uncomfortable at noontime to one that I wanted to bring together my closest friends and family on the best day of my life. We still struggle with radical inequality, but more people interact with people from broadly different social classes and cultures every day in New York than any other place in America, and possibly than in any other city in the world.
    And all of this happened, by choice, in the years since the attacks.

    In 2009, Eight Is Starting Over:

    [T]his year, I am much more at peace. It may be that, finally, we’ve been called on by our leadership to mark this day by being of service to our communities, our country, and our fellow humans. I’ve been trying of late to do exactly that. And I’ve had a bit of a realization about how my own life was changed by that day.
    Speaking to my mother last week, I offhandedly mentioned how almost all of my friends and acquaintances, my entire career and my accomplishments, my ambitions and hopes have all been born since September 11, 2001. If you’ll pardon the geeky reference, it’s as if my life was rebooted that day and in the short period afterwards. While I have a handful of lifelong friends with whom I’ve stayed in touch, most of the people I’m closest to are those who were with me on the day of the attacks or shortly thereafter, and the goals I have for myself are those which I formed in the next days and weeks. I don’t think it’s coincidence that I was introduced to my wife while the wreckage at the site of the towers was still smoldering, or that I resolved to have my life’s work amount to something meaningful while my beloved city was still papered with signs mourning the missing.

    In 2008, Seven Is Angry:

    Finally getting angry myself, I realize that nobody has more right to claim authority over the legacy of the attacks than the people of New York. And yet, I don’t see survivors of the attacks downtown claiming the exclusive right to represent the noble ambition of Never Forgetting. I’m not saying that people never mention the attacks here in New York, but there’s a genuine awareness that, if you use the attacks as justification for your position, the person you’re addressing may well have lost more than you that day. As I write this, I know that parked out front is the car of a woman who works in my neighborhood. Her car has a simple but striking memorial on it, listing her mother’s name, date of birth, and the date 9/11/2001.

    In 2007, Six Is Letting Go:

    On the afternoon of September 11th, 2001, and especially on September 12th, I wasn’t only sad. I was also hopeful. I wanted to believe that we wouldn’t just Never Forget, that we would also Always Remember. People were already insisting that we’d put aside our differences and come together, and maybe the part that I’m most bittersweet and wistful about was that I really believed it. I’d turned 26 years old just a few days before the attacks, and I realize in retrospect that maybe that moment, as I eased from my mid-twenties to my late twenties, was the last time I’d be unabashedly optimistic about something, even amidst all the sorrow.

    In 2006, After Five Years, Failure:

    [O]ne of the strongest feelings I came away with on the day of the attacks was a feeling of some kind of hope. Being in New York that day really showed me the best that people can be. As much as it’s become cliché now, there’s simply no other way to describe a display that profound. It was truly a case of people showing their very best nature.
    We seem to have let the hope of that day go, though.

    In 2005, Four Years:

    I saw people who hated New York City, or at least didn’t care very much about it, trying to act as if they were extremely invested in recovering from the attacks, or opining about the causes or effects of the attacks. And to me, my memory of the attacks and, especially, the days afterward had nothing to do with the geopolitics of the situation. They were about a real human tragedy, and about the people who were there and affected, and about everything but placing blame and pointing fingers. It felt thoughtless for everyone to offer their response in a framework that didn’t honor the people who were actually going through the event.

    In 2004, Thinking Of You:

    I don’t know if it’s distance, or just the passing of time, but I notice how muted the sorrow is. There’s a passivity, a lack of passion to the observances. I knew it would come, in the same way that a friend told me quite presciently that day back in 2001 that “this is all going to be political debates someday” and, well, someday’s already here.

    In 2003, Two Years:

    I spent a lot of time, too much time, resenting people who were visiting our city, and especially the site of the attacks, these past two years. I’ve been so protective, I didn’t want them to come and get their picture taken like it was Cinderella’s Castle or something. I’m trying really hard not to be so angry about that these days. I found that being angry kept me from doing the productive and important things that really mattered, and kept me from living a life that I know I’m lucky to have.

    In 2002, I wrote On Being An American:

    [I]n those first weeks, I thought a lot about what it is to be American. That a lot of people outside of New York City might not even recognize their own country if they came to visit. The America that was attacked a year ago was an America where people are as likely to have been born outside the borders of the U.S. as not. Where most of the residents speak another language in addition to English. Where the soundtrack is, yes, jazz and blues and rock and roll, but also hip hop and salsa and merengue. New York has always been where the first fine threads of new cultures work their way into the fabric of America, and the city that bore the brunt of those attacks last September reflected that ideal to its fullest.

    In 2001, Thank You:

    I am physically fine, as are all my family members and immediate friends. I’ve been watching the footage all morning, I can’t believe I watched the World Trade Center collapse…
    I’ve been sitting here this whole morning, choking back tears… this is just too much, too big. I can see the smoke and ash from the street here. I have friends of friends who work there, I was just there myself the day before yesterday. I can’t process this all. I don’t want to.

     
  • feedwordpress 14:11:02 on 2018/08/03 Permalink
    Tags: , iphone,   

    A Much Faster Way to Charge your iPhone 

    Forgive me, for I am about to commit gadget blogging. I've been using an iPhone X since they came out, and almost from the start my battery has charged at two to three times the speed of most people's phones. All you need is one new cable to do it.

    The short version: The iPhone X (and iPhone 8) supports a fast-charging mode. If you spend a little bit of money on a higher-wattage charger, you can fill up your battery much faster, especially when it's really low.

    Here's what the results look like with the fast-charging setup that I've got now, starting from a phone that was down to only 2% battery charge:

    Time (minutes):   0     4     20    75    90    120
    Charge:           2%    10%   37%   83%   91%   97%

    Underpowered

    It's a strange way of pinching pennies on a phone that costs a thousand dollars, but Apple still includes their chintzy little square phone charger with every iPhone sold. It's barely changed over the last decade, and puts out a paltry 5 watts. On the plus side, it is small and doesn't block adjacent outlets, which I suppose is nice for people who are short on space.

    But here's the terrible part: If the regular 5W charger is the only charger you use with an iPhone X, and your battery is running really low, plugging in for half an hour will only add about 20% to your battery. You'll still be in Low Power Mode after half an hour! That is no good.

    As always with Apple, the solution is to spend money. It's more money than you want to spend, but little enough that we'll all just suck it up and pay. Yay, Apple!


    Overkill

    I've ended up with a solution that is, admittedly, overkill. Through an absurd series of events, I ended up with a spare of Apple's most expensive charger: the 87 Watt USB-C Power Adapter. This is the most powerful laptop charger Apple sells, using its latest USB-C connector. Only the top-of-the-line 15" MacBook Pro even comes with this kind of power supply; the rest of the MacBooks make do with 61 watts or less — the regular MacBook only comes with a 29 watt charger! But each of these chargers uses USB-C, the new universal cable that's both enticingly simple and infuriatingly prone to unpredictable incompatibilities around power and data capabilities. The bottom line is, you don't need the ridiculously high-powered MacBook charger, because any of the Apple USB-C chargers can do the job. If you don't have any of these USB-C chargers, I've heard good things about Anker's 30W power plug.

    Once you've got one of these power bricks, or an extra USB-C port on your MacBook or iMac, it's time for the key step: grab Apple's USB-C to Lightning cable, which is frustratingly overpriced at twenty bucks, but worth it. (Amazingly, this represents a 25% price drop for this particular cable over last year's prices.)


    That's it. Plug in your pricey new USB-C to Lightning cable, and you'll be topping up your iPhone battery much faster. Of course, it matters most if your charge level gets really low — if you're already at 95%, none of these products will make much of a difference for getting to 100%.

    I've also used a number of the popular wireless charging (Qi) devices that the iPhone X and iPhone 8 support, and while they certainly work, they're really slow. I much prefer the speed of having my battery fill faster to the convenience of not having to plug in a cable, unless I'm at a public/shared charge point like at a coffee shop or airport.

    The rumors are that Apple is going to include the USB-C to Lightning cable along with the next generation of iPhones, and they certainly should. The default experience for people buying a top-of-the-line phone ought to be the fastest charging experience possible.

     
  • feedwordpress 15:41:09 on 2018/07/13 Permalink  

    Unfollowing Everybody 

    At this point, there's nothing novel about noticing that social media is often toxic and stressful. But even aside from those concerns, our social networks are not things we generally think of as requiring maintenance or upkeep, even though we routinely do regular updates on all the other aspects of our digital lives.

    Keeping in mind that spirit of doing necessary maintenance, I recently did something I'd thought about doing for years: I unfollowed everyone on Twitter. Now, these kinds of decisions are oddly fraught; a lot of people see their following relationships on social media as a form of status, not merely an indication of where information is flowing between people. But I decided to assume that the people I'm connected to know that me unfollowing everyone isn't personal, but really just a response to the overwhelming noise of having more than 5,000 accounts sharing info with me on a single network.


    How I did it

    Okay, this part is gonna get slightly geeky, if you're not a coder, but I thought I'd explain the process in case anyone wants to repeat it.

    Years ago, Twitter used to have a command-line interface for performing bulk or automated actions on an account. They abandoned it after a while, so Erik Berlin created a new command-line tool for power users of Twitter, simply called "t". It's written in Ruby (a language I basically can read but not really write) so it's easy enough to get running if you follow the few simple setup steps.

    As Erik mentions in that documentation, you'll then need to set up a new Twitter app on your account, and get the credentials that will let the t tool perform actions on your Twitter account. (Note: I got some errors while updating and authenticating; making these edits to one of the ruby libraries that t depends on fixed the issue immediately.)
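
    If you haven't used t before, the initial setup boils down to a couple of commands — both are covered in t's documentation, though the exact prompts can vary by version:

    # install the t gem (requires Ruby), then authorize it to act on your account
    gem install t
    t authorize

    The authorize step asks for the API key and secret from the Twitter app you just created, then opens a browser window so you can approve access for your account.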

    The Plan

    At that point, I wanted to follow a few simple steps. These took a little longer for me because I was following over 5,000 people on Twitter, but if you're following a more reasonable number, none of these steps should take more than a few minutes to complete. This was my plan:

    1. Copy all the people I was following to a Twitter list, so I could still access them in my Twitter apps on all my devices, and I could still see my old timeline at any point if I wanted to.
    2. Archive all of the people I was following into a spreadsheet, so I could sort through them and filter for geography or how many followers they have or whether they were verified or not — basically any criteria that might be interesting when deciding who to follow (or not follow).
    3. Actually unfollow everybody and start over.

    As it turns out, each of these steps is pretty easy.

    Copying everyone you follow to a list

    If you want to back up everyone you follow, you only need to make a list and then populate it. You can make lists in most regular Twitter apps, but doing it at the command line is simple: type in t list create following-`date "+%Y-%m-%d"` to make a list named after the current date, so you can easily remember this was a list of who you were following as of today. You can pretty easily understand the t syntax here — commands like list create are pretty self-explanatory.

    Next, we have a slightly more elaborate command to copy everyone you follow onto the new list; you'll dump out the list of everyone you follow, and then pipe that into another t command that adds them to your new list. It works like so: t followings | xargs t list add following-`date "+%Y-%m-%d"`. (If you're like me, you'll be doing all this stuff around midnight, and the date will change in the middle of it, and you should just type in the current date instead of using the date command.)

    That's it! Now you've got a list of everyone you follow, and if you browse that list in your Twitter client app, you should see the exact same thing as your regular timeline. Do note, though, that Twitter lists don't function well with more than a few thousand members. It took hours for all 5,000+ accounts to show up on the list, and in the interim the counts of how many people belonged to the list were often incorrect.

    Archiving the people you follow into a spreadsheet

    This one is just a fun thing to do in general, if you like to slice and dice data about your social network. t supports exporting a pretty broad set of data about the accounts you follow, not just their names and Twitter handles, by allowing for a "long format" export with complete data. You get stuff like how many favorites (likes) they have on Twitter, when their account was created, and how many people they follow or are followed by. Frustratingly, Twitter no longer makes it easy for this data export to include whether each person follows you back; that requires an additional query.

    You'll use CSV (comma-separated values) as the format for exporting your data into a spreadsheet. And good news! t supports that natively. So your command will look like this: t followings -l --csv > followings.csv which basically says "Export my followings, in long format, to a CSV file named 'followings.csv'." Once you do that, you can open it up in Excel or Google Sheets in a few clicks, and you're all set.

    Unfollowing Everybody

    After all the people I followed were in a spreadsheet, I was able to sort by how many followers or followings they had, and by their last update, and I found friends who'd passed away whose accounts had been dormant for years, or joke accounts whose relevance had expired, or quiet voices with small networks that had been drowned out amongst the cacophony of the many other voices I was hearing each day. I found this part to be a really worthwhile exercise, and definitely decided to follow fewer people with huge networks and lots of reach.
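
    You can do a rough version of that triage without leaving the command line, too. This is just a sketch — the column position below is an assumption, so check the header row of your own export first:

    # print the header row to find the column that holds follower counts
    head -1 followings.csv
    # if the follower count turns out to be column 8, show the 20 most-followed
    # accounts (naive comma-splitting breaks on bios that contain commas, so
    # treat this as an approximation; a spreadsheet is more robust)
    sort -t, -k8,8 -rn followings.csv | head -20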

    Actually unfollowing!

    Then, it was time for the main event: actually unfollowing everybody. I don't think this will be as much of a problem for other folks, but trying to run a single process of unfollowing everybody had me repeatedly running into Twitter's rate limits, where they try to keep any app from performing too many actions on your account in too short a period of time. I ended up writing a simple script to do the unfollowing in batches, then pausing for a few minutes, then starting up again.
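
    A minimal sketch of that batching approach looks something like this — the batch size of 100 and the five-minute pause are arbitrary numbers for illustration, not Twitter's documented limits:

    # dump everyone you follow to a file, split it into batches of 100,
    # and unfollow one batch at a time, pausing between batches
    t followings > followings.txt
    split -l 100 followings.txt batch-
    for f in batch-*; do
      xargs t unfollow < "$f"
      sleep 300
    done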

    But with a more reasonable network, the command to unfollow everyone is extremely simple:

    t followings | xargs t unfollow

    It'll chug away for a few minutes, and then that's it! You're not following anybody anymore. Except it might still look like you are.

    In my case, my following count was wrong for days, and Twitter kept showing wildly inaccurate information, like insisting that I was following one of Mike Pence's official accounts. (Needless to say, that was never the case.) All of this is due to an architectural decision called eventual consistency, which helps enable Twitter to scale to its massive size, but doesn't do as good a job of handling unusual circumstances like immediately showing the correct list of followings for someone who has just unfollowed thousands of accounts.

    Nevertheless, the deed was done. I refollowed a few essential accounts (my family, @Glitch and @Prince), and was ready to start anew.

    Lessons learned

    It's been about a week and a half, and, well... Twitter is a lot more pleasant. I've chosen a handful of accounts to follow each day (most of them accounts I followed before, some entirely new to me) and it's made a big difference. On the flip side, about 100 people seem to have unfollowed me after I unfollowed everybody, and I hope they hadn't felt obligated to reciprocate just because I was following them before. (That might also just be how many people unfollow me in a given week, I dunno.)

    One of the most immediate benefits is that, when something terrible happens in the news, I don't see an endless, repetitive stream of dozens of people reacting to it in succession. It turns out, I don't mind knowing about current events, but it hurts to see lots of people I care about going through anguish or pain when bad news happens. I want to optimize for being aware, but not emotionally overwhelmed.

    To that point, I've also basically not refollowed any news accounts or "official" corporate accounts. Anything I need to know about major headlines gets surfaced through other channels, or even just other parts of Twitter, so I don't need to see social media updates from media companies whose entire economic model is predicated on causing me enough stress to click through to their sites.

    Similarly, I've focused a lot more on artists and activists and people who write about the stuff I'm obsessed with in general — Prince or mangoes or urban transit or the like. That brings a lot more joy into my life, and people writing about these other topics offer a lot more inspiration for the things I want to be focused on. Oddly, given that my job is being the CEO of a tech company, I follow far fewer people in tech, and almost no tech company accounts except for my own. Despite that, I've missed almost nothing significant in the industry since making this change.

    The algorithm is learning

    Most interesting to me is how the suggested content and accounts on Twitter have changed since I changed my network. Before, much of the suggested headlines or featured Tweets in my Twitter apps would be from categories like "Technology VC"; now they're much more likely to be about "Climate Change" or "Comedians" than about inside-baseball tech talk.

    On the less positive side, Twitter still suggests that I follow accounts that are almost entirely men, and overwhelmingly white American men with verified Twitter accounts. This is bizarre to me as I'm now following nearly 100 accounts, and they're basically the same mix of races and genders and geographies that I've always been interested in hearing from. I would have expected Twitter's follow-suggestion algorithm to be at least as adaptive as its content-suggestion one, and hope that it'll get updated to feature accounts that don't fit the usual privileged patterns. (I do still follow a lot of verified accounts, but some of that is due to an oddity I've just realized, which is that a lot of my friends have verified accounts. Look, ma — I'm a big-city elite!)

    What Follows

    I don't have some grand takeaway about what all this means; obviously, I've been thinking about the design and impacts and best use of social networks on the web for basically as long as they've existed. I strongly believe we should be intentional in how we use our networks, and even spent years building tools to encourage that, though the corporate interest of the major social networks precludes building a business around encouraging healthier use of their platforms.

    But I'm happy for making a conscious decision about managing my network, and I lament that it takes a pretty extreme level of technical knowledge to be able to do so. I first wrote about Twitter when it was only a few months old, talking about its promise and predicting that Twitter would adopt @messaging and adapt to other ways its community was inventing new behaviors. Some of that happened, but of course most of what power users (and vulnerable users) wanted was never created.

    I've also written a good bit about the peculiarities of having a large network in social media, like Twitter's early practice of suggesting which accounts to follow (including mine!) and what it's like to have the social network of a famous person without actually being famous. I think a lot about why I "favorite" (or like) so many things on various networks. And I also hope people can think more broadly about the ways the design of social networks intersects with how we see ourselves, and how we see social status, as best exemplified by the huge social anxieties around what it's like being verified on Twitter.

    And ultimately, I come back to what I wrote a few years ago when I first decided to stop retweeting men (a practice I've followed for about half a decade now):

    If you’re inclined, try being mindful of whose voices you share, amplify, validate and promote to others.

    It's still a really important point, and to this list I would only add: Also be mindful about who you follow. And don't be afraid sometimes to reset and start over.

     
  • feedwordpress 02:09:01 on 2018/05/11 Permalink
    Tags: cartoons   

    The Cartoon Kit 

    Anything worth doing is worth doing meta. And Tom and Jerry is no exception.

    I've been trying to learn a bit more about the various eras of the Tom and Jerry cartoon, from the mega-racist Hanna-Barbera originals to the extremely stylized Chuck Jones episodes.

    Somewhere in the middle are the truly odd Gene Deitch-directed Tom and Jerry cartoons, where Deitch criticized the violence and monotony of the cartoons using the cartoons themselves.

    This self-critique reached its apotheosis with The Tom and Jerry Cartoon Kit from 1962. In it, Deitch spells out the formula for a rote cartoon while deconstructing it. I'd only ever seen this once as a kid, on TV at a friend's house, but it left an impression as if I'd seen it in a full-size theater during its original presentation.

    Here, see it for yourself. It's less than 7 minutes of your time, and has aged surprisingly well.

     
  • feedwordpress 14:21:48 on 2018/05/02 Permalink
    Tags: ,   

    It’s like Shazam — for your face! 

    Today's most fun new Glitch app is Record Player, which lets you upload a photo, then uses Google Cloud's Vision API to recognize the image and play it on Spotify.

    It works really well, but the real fun starts when you upload a selfie or a picture of yourself.


    I especially love that this was made by Patrick Weaver on the Mouse.org team, because it makes me think that kids learning computer science from Mouse curriculum here in NYC are going to start by seeing tech as enabling fun apps like this one! And you can, of course, View Source for the app and remix it to make your own variations.

     
  • feedwordpress 00:19:34 on 2018/04/08 Permalink
    Tags: ,   

    12 Things Everyone Should Understand About Tech 

    Tech is more important than ever, deeply affecting culture, politics and society. Given all the time we spend with our gadgets and apps, it’s essential to understand the principles that determine how tech affects our lives.

    Understanding technology today

    Technology isn’t an industry, it’s a method of transforming the culture and economics of existing systems and institutions. That can be a little bit hard to understand if we only judge tech as a set of consumer products that we purchase. But tech goes a lot deeper than the phones in our hands, and we must understand some fundamental shifts in society if we’re going to make good decisions about the way tech companies shape our lives—and especially if we want to influence the people who actually make technology.

    Even those of us who have been deeply immersed in the tech world for a long time can miss the driving forces that shape its impact. So here, we’ll identify some key principles that can help us understand technology’s place in culture.

    What you need to know:

    1. Tech is not neutral.

    One of the most important things everybody should know about the apps and services they use is that the values of technology creators are deeply ingrained in every button, every link, and every glowing icon that we see. Choices that software developers make about design, technical architecture or business model can have profound impacts on our privacy, security and even civil rights as users. When software encourages us to take photos that are square instead of rectangular, or to put an always-on microphone in our living rooms, or to be reachable by our bosses at any moment, it changes our behaviors, and it changes our lives.

    All of the changes in our lives that happen when we use new technologies do so according to the priorities and preferences of those who create those technologies.

    2. Tech is not inevitable.

    Popular culture presents consumer technology as a never-ending upward progression that continuously makes things better for everybody. In reality, new tech products usually involve a set of tradeoffs where improvements in areas like usability or design come along with weaknesses in areas like privacy & security. Sometimes new tech is better for one community while making things worse for others. Most importantly, just because a particular technology is “better” in some way doesn’t guarantee it will be widely adopted, or that it will cause other, more popular technologies to improve.
    In reality, technological advances are a lot like evolution in the biological world: there are all kinds of dead-ends or regressions or uneven tradeoffs along the way, even if we see broad progress over time.

    3. Most people in tech sincerely want to do good.

    We can be thoughtfully skeptical and critical of modern tech products and companies without having to believe that most people who create tech are “bad”. Having met tens of thousands of people around the world who create hardware and software, I can attest that the cliché that they want to change the world for the better is a sincere one. Tech creators are very earnest about wanting to have a positive impact. At the same time, it’s important for those who make tech to understand that good intentions don’t absolve them from being responsible for the negative consequences of their work.

    It’s useful to acknowledge the good intentions of most people in tech because it lets us follow through on those intentions and reduce the influence of those who don’t have good intentions, and to make sure the stereotype of the thoughtless tech bro doesn’t overshadow the impact that the majority of thoughtful, conscientious people can have. It’s also essential to believe that there is good intention underlying most tech efforts if we’re going to effectively hold everyone accountable for the tech they create.

    4. Tech history is poorly documented and poorly understood.

    People who learn to create tech can usually find out every intimate detail of how their favorite programming language or device was created, but it’s often near impossible to know why certain technologies flourished, or what happened to the ones that didn’t. While we’re still early enough in the computing revolution that many of its pioneers are still alive and working to create technology today, it’s common to find that tech history as recent as a few years ago has already been erased. Why did your favorite app succeed when others didn’t? What failed attempts were made to create such apps before? What problems did those apps encounter — or what problems did they cause? Which creators or innovators got erased from the stories when we created the myths around today’s biggest tech titans?

    All of those questions get glossed over, silenced, or sometimes deliberately answered incorrectly, in favor of building a story of sleek, seamless, inevitable progress in the tech world. Now, that’s hardly unique to technology — nearly every industry can point to similar issues. But that ahistorical view of the tech world can have serious consequences when today’s tech creators are unable to learn from those who came before them, even if they want to.

    5. Most tech education doesn’t include ethical training.

    In mature disciplines like law or medicine, we often see centuries of learning incorporated into the professional curriculum, with explicit requirements for ethical education. Now, that hardly stops ethical transgressions from happening—we can see deeply unethical people in positions of power today who went to top business schools that proudly tout their vaunted ethics programs. But that basic level of familiarity with ethical concerns gives those fields a broad fluency in the concepts of ethics so they can have informed conversations. And more importantly, it ensures that those who want to do the right thing and do their jobs in an ethical way have a firm foundation to build on.

    But until the very recent backlash against some of the worst excesses of the tech world, there had been little progress in increasing the expectation of ethical education being incorporated into technical training. There are still very few programs aimed at upgrading the ethical knowledge of those who are already in the workforce; continuing education is largely focused on acquiring new technical skills rather than social ones. There’s no silver-bullet solution to this issue; it’s overly simplistic to think that simply bringing computer scientists into closer collaboration with liberal arts majors will significantly address these ethics concerns. But it is clear that technologists will have to rapidly become fluent in ethical concerns if they want to continue to have the widespread public support that they currently enjoy.

    6. Tech is often built with surprising ignorance about its users.

    Over the last few decades, society has greatly increased in its respect for the tech industry, but this has often resulted in treating the people who create tech as infallible. Tech creators now regularly get treated as authorities in a wide range of fields like media, labor, transportation, infrastructure and political policy — even if they have no background in those areas. But knowing how to make an iPhone app doesn’t mean you understand an industry you’ve never worked in!

    The best, most thoughtful tech creators engage deeply and sincerely with the communities that they want to help, to ensure they address actual needs rather than indiscriminately “disrupting” the way established systems work. But sometimes, new technologies run roughshod over these communities, and the people making those technologies have enough financial and social resources that the shortcomings of their approaches don’t keep them from disrupting the balance of an ecosystem. Oftentimes, tech creators have enough money funding them that they don’t even notice the negative effects of the flaws in their designs, especially if they’re isolated from the people affected by those flaws. Making all of this worse are the problems with inclusion in the tech industry, which mean that many of the most vulnerable communities will have little or no representation amongst the teams that create new tech, preventing those teams from being aware of concerns that might be of particular importance to those on the margins.


    7. There is never just one single genius creator of technology.

    One of the most popular representations of technology innovation in popular culture is the genius in a dorm room or garage, coming up with a breakthrough innovation as a “Eureka!” moment. It feeds the common myth-making around people like Steve Jobs, where one individual gets credit for “inventing the iPhone” when it was the work of thousands of people. In reality, tech is always informed by the insights and values of the community where its creators are based, and nearly every breakthrough moment is preceded by years or decades of others trying to create similar products.

    The “lone creator” myth is particularly destructive because it exacerbates the exclusion problems which plague the tech industry overall; those lone geniuses that are portrayed in media are seldom from backgrounds as diverse as people in real communities. While media outlets may benefit from being able to give awards or recognition to individuals, or educational institutions may be motivated to build up the mythology of individuals in order to bask in their reflected glory, the real creation stories are complicated and involve many people. We should be powerfully skeptical of any narratives that indicate otherwise.

    8. Most tech isn’t from startups or made by startups.

    Only about 15% of programmers work at startups, and in many big tech companies, most of the staff aren’t even programmers anyway. So defining tech by the habits or culture of programmers who work at big-name startups deeply distorts the way that tech is seen in society. Instead, we should remember that the majority of people who create technology work in organizations or institutions that we don’t think of as “tech” at all.

    What’s more, there are lots of independent tech companies — little indie shops or mom-and-pop businesses that make websites, apps, or custom software, and a lot of the most talented programmers prefer the culture or challenges of those organizations over the more famous tech titans. We shouldn’t erase the fact that startups are only a tiny part of tech, and we shouldn’t let the extreme culture of many startups distort the way we think about technology overall.

    9. Most big tech companies make money in just one of three ways.

    It’s important to understand how tech companies make money if you want to understand why tech works the way that it does.

    • Advertising: Google and Facebook make nearly all of their money from selling information about you to advertisers. Almost every product they create is designed to extract as much information from you as possible, so that it can be used to build a more detailed profile of your behaviors and preferences, and the search results and social feeds these advertising companies make are strongly incentivized to push you toward sites or apps that show you more of their ads. It’s a business model built around surveillance, which is particularly striking since it’s the one that most consumer internet businesses rely upon.
    • Big Business: Some of the larger (generally more boring) tech companies like Microsoft and Oracle and Salesforce exist to get money from other big companies that need business software but will pay a premium if it’s easy to manage and easy to lock down the ways that employees use it. Very little of this technology is a delight to use, especially because the customers for it are obsessed with controlling and monitoring their workers, but these are some of the most profitable companies in tech.
    • Individuals: Companies like Apple and Amazon want you to pay them directly for their products, or for the products that others sell in their store. (Although Amazon Web Services exists to serve that Big Business market, above.) This is one of the most straightforward business models—you know exactly what you’re getting when you buy an iPhone or a Kindle, or when you subscribe to Spotify. And because this model doesn’t rely on advertising or cede purchasing control to your employer, companies that follow it tend to be the ones where individual people have the most power.

    That’s it. Pretty much every company in tech is trying to do one of those three things, and you can understand why they make their choices by seeing how those choices connect to these three business models.


    10. The economic model of big companies skews all of tech.

    Today’s biggest tech companies follow a simple formula:

    • Make an interesting or useful product that transforms a big market
    • Get lots of money from venture capital investors
    • Try to quickly grow a huge audience of users even if that means losing a lot of money for a while
    • Figure out how to turn that huge audience into a business worth enough to give investors an enormous return
    • Start ferociously fighting (or buying off) any competing companies in the market

    This model looks very different from traditional growth companies, which start off as small businesses and grow primarily by attracting customers who directly pay for goods or services. Companies that follow this new model can grow much larger, much more quickly, than older companies that had to rely on revenue growth from paying customers. But these new companies also have much lower accountability to the markets they’re entering, because they’re serving their investors’ short-term interests ahead of their users’ or community’s long-term interests.

    The pervasiveness of this kind of business plan can make competition almost impossible for companies without venture capital investment. Regular companies that grow based on earning money from customers can’t afford to lose that much money for that long a time. It’s not a level playing field, which often means that companies are stuck being either little indie efforts or giant monstrous behemoths, with very little in between. The end result looks a lot like the movie industry, where there are tiny indie arthouse films and big superhero blockbusters, and not very much else.

    And the biggest cost for these big new tech companies? Hiring coders. They pump the vast majority of their investment money into hiring and retaining the programmers who’ll build their new tech platforms. Precious little of those enormous piles of money is put into things that will serve a community or build equity for anyone other than the founders or investors in the company. There is no aspiration that building a hugely valuable company should also mean creating lots of jobs for lots of different kinds of people.

    11. Tech is as much about fashion as function.

    To outsiders, creating apps or devices is presented as a hyper-rational process where engineers choose technologies based on which are the most advanced and appropriate to the task. In reality, the choice of things like programming languages or toolkits can be subject to the whims of particular coders or managers, or to whatever’s simply in fashion. Just as often, the process or methodology by which tech is created can follow fads of its own, affecting everything from how meetings are run to how products are developed.

    Sometimes the people creating technology seek novelty, sometimes they want to go back to the staples of their technological wardrobe, but these choices are swayed by social factors in addition to an objective assessment of technical merit. And a more complex technology doesn’t always equal a more valuable end product, so while many companies like to tout how ambitious or cutting-edge their new technologies are, that’s no guarantee that they provide more value for regular users, especially when new technologies inevitably come with new bugs and unexpected side-effects.

    12. No institution has the power to rein in tech’s abuses.

    In most industries, if companies start doing something wrong or exploiting consumers, they’ll be reined in by journalists who investigate and criticize their actions. Then, if the abuses continue and become serious enough, the companies can be sanctioned by lawmakers at the local, state, national, or international level.

    Today, though, much of the tech trade press focuses on covering the launch of new products or new versions of existing products, and the tech reporters who do cover the important social impacts of tech are often relegated to being published alongside reviews of new phones, instead of being prominently featured in business or culture coverage. Though this has started to change as tech companies have become absurdly wealthy and powerful, coverage is also still constrained by the culture within media companies. Traditional business reporters often have seniority in major media outlets, but are commonly illiterate in basic tech concepts in a way that would be unthinkable for journalists who cover finance or law. Meanwhile, dedicated tech reporters who may have a better understanding of tech’s impact on culture are often assigned to (or inclined to) cover product announcements instead of broader civic or social concerns.

    The problem is far more serious when we consider regulators and elected officials, who often brag about their own illiteracy about tech. Political leaders who can’t even install an app on their smartphones can’t possibly understand technology well enough to regulate it appropriately, or to assign legal accountability when tech’s creators violate the law. Even as technology opens up new challenges for society, lawmakers lag tremendously behind the state of the art when creating appropriate laws.

    Without the corrective force of journalistic and legislative accountability, tech companies often run as if they’re completely unregulated, and the consequences of that reality usually fall on those outside of tech. Worse, traditional activists who rely on conventional methods such as boycotts or protests often find those methods ineffective against the indirect business models of giant tech companies, which can rely on advertising, surveillance (“gathering user data”), or venture capital investment to keep operating even when activists succeed in identifying problems.

    This lack of systems of accountability is one of the biggest challenges facing tech today.


    If we understand these things, we can change tech for the better.

    If everything is so complicated, and so many important points about tech aren’t obvious, should we just give up hope? No.

    Once we know the forces that shape technology, we can start to drive change. If we know that the biggest cost for the tech giants is attracting and hiring programmers, we can encourage programmers to collectively advocate for ethical and social advances from their employers. If we know that the investors who power big companies respond to potential risks in the market, we can emphasize that their investment risk increases if they bet on companies that act in ways that are bad for society.

    If we understand that most people in tech mean well, but lack the historical or cultural context to ensure that their impact is as good as their intentions, we can make sure they get the knowledge they need to prevent harm before it happens.

    So many of us who create technology, or who love the ways it empowers us and improves our lives, are struggling with the many negative effects that some of these same technologies are having on society. But perhaps if we start from a set of common principles that help us understand how tech truly works, we can start to tackle technology’s biggest problems.


     
  • feedwordpress 17:00:00 on 2018/03/22 Permalink
    Tags: html

    The Missing Building Blocks of the Web 

    The Missing Building Blocks of the Web

    At a time when millions are losing trust in the web’s biggest sites, it’s worth revisiting the idea that the web was supposed to be made out of countless little sites. Here’s a look at the neglected technologies that were supposed to make it possible.

    Though the world wide web has been around for more than a quarter century, people have been theorizing about hypertext and linked documents and a global network of apps for at least 75 years, and perhaps longer. And while some of those ideas are now obsolete, or were hopelessly academic as concepts, or seem incredibly obvious in a world where we’re all on the web every day, the time is perfect to revisit a few of the overlooked gems from past eras. Perhaps modern versions of these concepts could be what helps us rebuild the web into something that has the potential, excitement, and openness that got so many of us excited about it in the first place.

    [An aside: Our team at Glitch has been hard at work on delivering many of the core ideas discussed in this piece, including new approaches to View Source, Authoring, Embedding, and more. If these ideas resonate with you, we hope you’ll check out Glitch and see how we can bring these abilities back to the web.]

    View Source

    For the first few years of the web, the fundamental way that people learned to build web pages was by using the “View Source” feature in their web browser. You would point your mouse at a menu that said something like “View Source” (nobody was browsing the web on a touchscreen back then) and suddenly you’d see the HTML code that made up the page you were looking at. If you squinted, you could see the text you’d been reading, and wrapped around it was a fairly comprehensible set of tags — you know, that <p>paragraph</p> kind of stuff.
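    For anyone who never saw it, here’s a minimal sketch of the kind of complete page View Source would reveal in those days, with every tag visible and nothing hidden behind tooling:

        <!DOCTYPE html>
        <html>
          <head>
            <title>My First Page</title>
          </head>
          <body>
            <h1>Hello, web!</h1>
            <p>This is a paragraph. You can see exactly how it was made.</p>
            <a href="https://example.com">And this is how links work.</a>
          </body>
        </html>

    Everything you saw on the screen mapped directly to a line you could read in the source.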

    It was one of the most effective technology teaching tools ever created. And no surprise, since the web was invented for the purpose of sharing knowledge.

    These days, View Source is in bad shape. Most mobile devices don’t support the feature at all. And even on the desktop, the feature gets buried away, or hidden unless you enable special developer settings. It’s especially egregious because the tools for working with HTML in a browser are better than ever. Developers have basically given ordinary desktop web browsers the potential to be smart, powerful tools for creating web pages.

    But that leads to the other problem. Most complicated web pages these days aren’t actually written by anyone. They’re assembled by little programs that take the instructions written by a coder and translate them into the actual HTML (and CSS, and JavaScript, and images, and everything else) that goes to your browser. If you’re an expert, maybe you can figure out what tools were used to assemble the page, and go to GitHub and find some version of those tools to try out. But it’s the difference between learning to cook by looking over someone’s shoulder and being told where a restaurant bought its ingredients.
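    To see the difference, here’s a toy sketch (a stand-in for a build tool or framework, not any particular one) of a page whose visible content never appears in its source at all. View Source shows only this empty shell:

        <!DOCTYPE html>
        <html>
          <body>
            <div id="app"></div>
            <script>
              // The content you actually read is generated at runtime,
              // so it never appears in the source the author wrote.
              const posts = ["Hello", "World"];
              document.getElementById("app").innerHTML =
                posts.map(p => "<p>" + p + "</p>").join("");
            </script>
          </body>
        </html>

    A learner squinting at that source would find nothing resembling the page they were just reading.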

    Bringing View Source back could empower a new generation of creators to see the web as something they make, not just a place where big companies put up sites that we all dump our personal data into.


    Authoring

    When Tim Berners-Lee invented the world wide web, he assumed that, just like in earlier hypertext systems, every web browser would be able to write web pages just as easily as it read them. In fact, that early belief led many who pioneered the web to assume that the format of HTML itself didn’t matter that much, as many different browsing tools would be able to create it.

    In some ways, that’s true — billions of people make things on the web all the time. Only they don’t know they’re making HTML, because Facebook (or Instagram, or whatever other app they’re using) generates it for them.

    Interestingly, it’s one of Facebook’s board members who helped cause this schism between reading and writing on the web. Marc Andreessen pioneered the early Mosaic web browser, and then famously went on to spearhead Netscape, the first broadly-available commercial web browser. But Netscape wasn’t made as a publicly-funded research project at a state university — it was a hot startup company backed by a lot of venture capital investment.

    It’s no surprise, then, that the ability to create web pages was reserved for Netscape Navigator Gold, the paid version of that first broadly consumer-oriented web browser. Reading things on the web would be free, sure. But creating things on the web? We’d pay venture-backed startup tech companies for the ability to do that, and they’d mediate it for us.

    Notwithstanding Facebook’s current dominance, there are still a lot of ways to publish actual websites instead of just dumping little bits of content into the giant social network. There are all kinds of “site building” tools that let you pick a template and publish. Professionals have authoring tools or content management systems for maintaining big, serious websites. But these days, there are very few tools you could just use on your computer (or your tablet, or your phone) to create a web page or web site from scratch.

    All that could change quickly, though—the barriers are lower than ever to reclaiming the creative capability that the web was supposed to have right from its birth.


    Embedding (Transclusion!)

    Okay, this one’s nerdy. But I’m just gonna put it out there: You’re supposed to be able to include other websites (or parts of other websites) in your web pages. Sure, we can do some of that — you’ve seen plenty of YouTube videos embedded inside articles that you’ve read, and as media sites pivot to video, that’s only gotten more commonplace.

    But you almost never see a little functional part of one website embedded in another. Old-timers might remember when Flash ruled the web, and people made simple games or interactive art pieces that would then get shared on blogs or other media sites. Except for the occasional SoundCloud song on someone’s Tumblr, it’s a grim landscape for anyone who can imagine a web where bits and pieces of different sites are combined together like Legos.

    Most of the time, we talk about this functionality as “embedding” a widget from one site into another. There was even a brief fad during the heyday of blogs more than a decade ago where people started entire companies around the idea of making “widgets” that would get shared on blogs or even on company websites. These days that capability is mostly used to put a Google Map onto a company’s site so you can find their nearest location.
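    Mechanically, almost all of these embeds come down to the same building block: an iframe, a small window onto another site dropped into your own page. A minimal sketch (the URL here is made up for illustration):

        <!-- A fragment of one site embedded in another. -->
        <iframe src="https://example.com/widgets/map?location=nyc"
                width="400" height="300"
                style="border: 0"
                loading="lazy"
                title="Embedded map widget">
        </iframe>

    The embedded page keeps its own behavior and styling, which is what makes widgets portable from site to site.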

    Those old hypertext theory people had broader ambitions, though. They thought we might someday be able to pull live, updated pieces of other sites into our own websites, mixing and matching data or even whole apps as needed. This ability to include part of one web page into another was called “transclusion”, and it’s remained a bit of a holy grail for decades.

    There’s no reason this can’t be done today, especially since the way we build web pages in the modern era often involves generating just partial pages, or sending along only the data that’s been updated on a particular site. If we can solve the security and performance concerns of sharing data this way, we could address one of the biggest unfulfilled promises of the web.
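    A do-it-yourself sketch of transclusion might look like the following: fetch a fragment of another page and splice it into your own. (The fragment URL is hypothetical, and this only works if the other site permits cross-origin requests, which is exactly the kind of security question that would need answering.)

        <!-- Pull a live fragment of another site into this page. -->
        <div id="transcluded">Loading…</div>
        <script>
          // Assumes the remote site serves this fragment with CORS
          // headers that allow our page to read it, and that we
          // trust its HTML enough to inject it directly.
          fetch("https://example.com/fragments/latest-posts.html")
            .then(response => response.text())
            .then(html => {
              document.getElementById("transcluded").innerHTML = html;
            });
        </script>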


    Your own website at your own address

    This one is so obvious, but we seem to have forgotten all about it: The web was designed so that everybody was supposed to have their own website, at its own address. Of course, things got complicated early on — it was too hard to run your own website (let alone your own web server!) and the relative scarcity of domain names made them expensive and a pain for everybody to buy.

    If you just wanted to share some ideas, or talk to your friends, or do your work, managing all that hassle became too much trouble, and pretty soon a big, expensive industry of web consultants sprang up to handle the needs of anybody who still actually wanted their own website—and had the money to pay for it.

    But things have gotten much easier. There are plenty of tools for easily building a website now, and many of them are free. And while companies still usually have a website of their own, an individual having a substantial website (not just a one-page placeholder) is pretty unusual these days unless they’re a Social Media Expert or somebody with a book to sell.

    There’s no reason it has to be that way, though. There are no technical barriers for why we couldn’t share our photos to our own sites instead of to Instagram, or why we couldn’t post stupid memes to our own web address instead of on Facebook or Reddit. There are social barriers, of course — if we stubbornly used our own websites right now, none of our family or friends would see our stuff. Yet there’s been a dogged community of web nerds working on that problem for a decade or two, trying to see if they can get the ease or convenience of sharing on Facebook or Twitter or Instagram to work across a distributed network where everyone has their own websites.

    Now, none of that stuff is simple enough yet. It’s for nerds, or sometimes, it’s for nobody at all. But the same was true of the web itself, for years, when it was young. This time, we know the stakes, and we can imagine the value of having a little piece of the internet that we own ourselves, and have some control over.

    It’s not impossible that we could still complete the unfinished business that’s left over from the web’s earliest days. And I have to imagine it’ll be kind of fun and well worth the effort to at least give it a try.


    In a similar vein, you may also enjoy this look at the lost infrastructure of the early era of social media.

     