The Subjective Charms of Objective-C

Original link: https://www.wired.com/story/objective-c-programming-language-verbose/

Leibniz once dreamed of creating a “characteristica universalis,” a perfect language capable of expressing every scientific truth. That ambition lives on in programming languages, as developers strive to build systems expressive enough to leave bugs no place to hide and self-evident enough to need no documentation. The author was captivated by Objective-C, a language that, for all its verbosity and divisiveness, delivered an intoxicating sense of self-expression. Born of the object-oriented programming movement, Objective-C combined the C language with reusable “objects” that interact by sending messages. Its survival owes much to Steve Jobs’ NeXT, which used it to build its operating system. In the end, Objective-C’s sentence-like structure and descriptive method names, however convoluted, ignited the author’s passion for programming and gave him a sense of unlimited creative possibility.

This Hacker News thread discusses an article exploring the “subjective charms” of Objective-C and its supposed decline. Several commenters push back on the article’s negative portrayal of the language. User bediger4000 highlights the usability and well-designed libraries of NeXT’s Objective-C environment and attributes its struggles to Microsoft’s dominance at the time. vaxman offers a detailed historical perspective, arguing that Objective-C’s early limitations stemmed from American enterprises’ wariness of vendor lock-in, followed by the rise of Java and NeXT’s pivot to WebObjects; later, concerns about iPhone app stability prompted consideration of alternatives such as Java. vaxman contends that, even after the introduction of Swift, declaring Objective-C dead is premature, citing existing codebases, performance advantages, and the large body of Objective-C code in the training data of AI coding models, and argues the language remains relevant, especially with the rise of agentic apps. Another user, markus_zhang, expresses a general sense of programming fatigue.

  • Original article

    After inventing calculus, actuarial tables, and the mechanical calculator and coining the phrase “best of all possible worlds,” Gottfried Leibniz still felt his life’s work was incomplete. Since boyhood, the 17th-century polymath had dreamed of creating what he called a characteristica universalis—a language that perfectly represented all scientific truths and would render making new discoveries as easy as writing grammatically correct sentences. This “alphabet of human thought” would leave no room for falsehoods or ambiguity, and Leibniz would work on it until the end of his life.

    A version of Leibniz’s dream lives on today in programming languages. They don’t represent the totality of the physical and philosophical universe, but instead, the next best thing—the ever-flipping ones and zeroes that make up a computer’s internal state (binary, another Leibniz invention). Computer scientists brave or crazy enough to build new languages chase their own characteristica universalis, a system that could allow developers to write code so expressive that it leaves no dark corners for bugs to hide and so self-evident that comments, documentation, and unit tests become unnecessary.

    But expressiveness, of course, is as much about personal taste as it is information theory. For me, just as listening to Countdown to Ecstasy as a teenager cemented a lifelong affinity for Steely Dan, my taste in programming languages was shaped the most by the first one I learned on my own—Objective-C.

    To argue that Objective-C resembles a metaphysically divine language, or even a good language, is like saying Shakespeare is best appreciated in Pig Latin. Objective-C is, at best, polarizing. Ridiculed for its unrelenting verbosity and peculiar square brackets, it is used only for building Mac and iPhone apps and would have faded into obscurity in the early 1990s had it not been for an unlikely quirk of history. Nevertheless, in my time working as a software engineer in San Francisco in the early 2010s, I repeatedly found myself at dive bars in SoMa or in the comments of Hacker News defending its most cumbersome design choices.

    Objective-C came to me when I needed it most. I was a rising college senior and had discovered an interest in computer science too late to major in it. As an adult old enough to drink, I watched teenagers run circles around me in entry-level software engineering classes. Smartphones were just starting to proliferate, but I realized my school didn’t offer any mobile development classes—I had found a niche. I learned Objective-C that summer from a cowboy-themed book series titled The Big Nerd Ranch. The first time I wrote code on a big screen and saw it light up pixels on the small screen in my hand, I fell hard for Objective-C. It made me feel the intoxicating power of unlimited self-expression and let me believe I could create whatever I might imagine. I had stumbled across a truly universal language and loved everything about it—until I didn’t.

    Objective-C came up in the frenzied early days of the object-oriented programming era, and by all accounts, it should have never survived past it. By the 1980s, software projects had grown too large for one person, or even one team, to develop alone. To make collaboration easier, Xerox PARC computer scientist Alan Kay had created object-oriented programming—a paradigm that organized code into reusable “objects” that interact by sending each other “messages.” For instance, a programmer could build a Timer object that could receive messages like start, stop, and readTime. These objects could then be reused across different software programs. In the 1980s, excitement about object-oriented programming was so high that a new language was coming out every few months, and computer scientists argued that we were on the precipice of a “software industrial revolution.”
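
    To make the paradigm concrete, here is a minimal sketch of that Timer object in Objective-C. The Timer class and its start, stop, and readTime messages come from the example above; the implementation details are assumptions for illustration, not any real framework API.

        #import <Foundation/Foundation.h>

        // The article's hypothetical Timer object: it answers the
        // messages start, stop, and readTime.
        @interface Timer : NSObject
        - (void)start;
        - (void)stop;
        - (NSTimeInterval)readTime; // seconds elapsed while running
        @end

        @implementation Timer {
            NSDate *_startDate;      // nil whenever the timer is stopped
            NSTimeInterval _elapsed; // time accumulated across past runs
        }

        - (void)start {
            if (!_startDate) _startDate = [NSDate date];
        }

        - (void)stop {
            if (_startDate) {
                // timeIntervalSinceNow is negative for past dates,
                // so negate it to get the elapsed seconds.
                _elapsed += -[_startDate timeIntervalSinceNow];
                _startDate = nil;
            }
        }

        - (NSTimeInterval)readTime {
            NSTimeInterval running = _startDate ? -[_startDate timeIntervalSinceNow] : 0;
            return _elapsed + running;
        }
        @end

    Any program that needs a stopwatch can then reuse the object simply by sending it those three messages ([timer start], [timer stop], [timer readTime]), which is exactly the kind of reuse the paradigm promised.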

    In 1983, Tom Love and Brad Cox, software engineers at International Telephone & Telegraph, combined object-oriented programming with the popular, readable syntax of the C programming language to create Objective-C. The pair started a short-lived company to license the language and sell libraries of objects, and before it went belly up they landed the client that would save their creation from falling into obscurity: NeXT, the computer firm Steve Jobs founded after his ouster from Apple. When Jobs triumphantly returned to Apple in 1997, he brought NeXT's operating system—and Objective-C—with him. For the next 17 years, Cox and Love's creation would power the products of the most influential technology company in the world.

    I became acquainted with Objective-C a decade and a half later. I saw how objects and messages take on a sentence-like structure, punctuated by square brackets, like [self.timer increaseByNumberOfSeconds:60]. These were not curt, Hemingwayesque sentences, but long, floral, Proustian ones, syntactically complex and evoking vivid imagery with function names like scrollViewDidEndDragging:willDecelerate:.
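
    For readers who have never seen the syntax, here is a short, self-contained sketch of how those bracketed “sentences” compose. Everything below is standard Foundation usage except SnoozableTimer and its increaseByNumberOfSeconds: message, which are hypothetical stand-ins matching the snippet quoted above.

        #import <Foundation/Foundation.h>

        // A hypothetical timer that understands the message quoted in the
        // article; this is not a real Foundation class.
        @interface SnoozableTimer : NSObject
        @property (nonatomic) NSTimeInterval secondsRemaining;
        - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds;
        @end

        @implementation SnoozableTimer
        - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds {
            self.secondsRemaining += seconds;
        }
        @end

        int main(void) {
            @autoreleasepool {
                SnoozableTimer *timer = [[SnoozableTimer alloc] init];

                // Receiver first, then a verb phrase naming each argument:
                // [object messageWithArgument:value].
                [timer increaseByNumberOfSeconds:60];

                // Message sends nest, so one "sentence" can contain another.
                NSLog(@"%@", [NSString stringWithFormat:@"%.0f seconds left",
                              [timer secondsRemaining]]);
            }
            return 0;
        }

    Each colon in a selector marks an argument slot, so the method name interleaves with its parameters; that interleaving is what gives Objective-C calls their long, sentence-like cadence.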
