I think comrade glauber is incorrect. First of all, he got our mantra wrong. It is: The three characteristics of Perl programmers: mundaneness, sloppiness, and fatuousness. Secondly, our language is not evolved to support no fucking real life no shit. Our language is designed to be a fuckup from the very beginning.
Designed, to fuck up those computer scientists. Fuck up their teachings. Fuck up their students. Fuck up their language. Fuck up their correctness.
Fuck up their fucking theoretical theories. Remember, P is for Practical.

Xah Lee in comp.

Perl did some things well: It transcended implementation differences by staring them in the eye and fighting it out, not by giving up, whining that something isn't standard and portable, etc. It gave the bad standards and their nonsensical implementation differences the finger and wrote its own standard.
For the kinds of tasks Perl does well, it is downright impressively portable. You need the ever expanding Perl book, but not the tens or hundreds of shelf-feet of manuals that you used to have to deal with. Perl has created its own operating system interface precisely by being an incredibly ugly implementation, and I'll give Larry Wall this, but not much else: He did fully understand the value of uniform external behavior of a tool and he was willing to pay the price to get it. There is no excuse for the language he created in order to do this, however. What really pisses me off with Perl is that people work so hard doing so terribly little while they think they have worked very little doing something really nifty.
Dijkstra, quoted in the program 10M:

Another series of [philosopher's] stones in the form of "programming tools" is produced under the banner of "software engineering", which, as time went by, has sought to replace intellectual discipline by management discipline to the extent that it has now accepted as its charter "How to program if you cannot."

Dijkstra, The threats to computing science (EWD):

In a cruel twist of history, however, American society has chosen precisely the 20th Century to become more and more a-mathematical. The suggestion that the programming problem could be amenable to mathematical treatment is, if heard at all, instantaneously rejected as being totally unrealistic.
As a result, Program Design is prevented from becoming a subdiscipline of Computing Science. And in the mean time, programming methodology —renamed "software engineering"— has become the happy hunting-ground for the gurus and the quacks.
The problems of business administration in general and data base management in particular are much too difficult for people that think in IBMerese, compounded with sloppy English.

Robert and I both knew Lisp well, and we couldn't see any reason not to trust our instincts and go with Lisp. But we also knew that that didn't mean anything.
If you chose technology that way, you'd be running Windows. During the years we worked on Viaweb I read a lot of job descriptions. A new competitor seemed to emerge out of the woodwork every month or so. The first thing I would do, after checking to see if they had a live online demo, was look at their job listings. After a couple years of this I could tell which companies to worry about and which not to. The more of an IT flavor the job descriptions had, the less dangerous the company was.
The safest kind were the ones that wanted Oracle experience.
You never had to worry about those. If they wanted Perl or Python programmers, that would be a bit frightening - that's starting to sound like a company where the technical side, at least, is run by real hackers. If I had ever seen a job posting looking for Lisp hackers, I would have been really worried.
During the Bubble, Oracle used to run ads saying that Yahoo used Oracle software. I found this hard to believe, so I asked around. It turned out the Yahoo accounting department used Oracle.

Paul Graham, on database-backed web applications.

Technology is part of the answer, not part of the question.
Don't make choices only to then try to figure out how to twist the problem in such a way so as to fit your choice. This will often result in your solution being more convoluted than my previous sentence. A few years from now, programming will have been revolutionized and lots and lots of work will be done by software that writes itself.
This will require massive talent and intelligence and thinking outside the box and what have you, but for the time being, programming is a "consumer" job, "assembly line" coding is the norm, and what little exciting stuff is being performed is not going to make it compared to the mass-marketed crap sold by those who think they can surf on the previous half-century's worth of inventions forever. This will change, however, and those who know Common Lisp will be relieved of reinventing it, like the rest are doing, even in the "tools" world, badly. As long as the industry neither requires, nor rewards knowledge of fundamentals, why should we expect anything else?
The Mac operating system is like the monorail at Disney World. It's kind of spectacular and fun, but it doesn't go much of anywhere. Still, the kids like it. Unix is like the maritime transit system in an impoverished country. The ferryboats are dangerous as hell, offer no protection from the weather and leak like sieves. Every monsoon season a couple of them capsize and drown all the passengers, but people still line up for them and crowd aboard. It's there, but people just ignore it and find other ways of getting where they want to go.

Posted by Paul A. Vixie to rec.

You're posting to a Scheme group.
I am sick of seeing C code which initialises variables to values that never get used; that is lying to the reader, and lying to the reader is never a good idea. Variables should only ever be initialised when you have a value that you intend to use that you can initialise them with.

O'Keefe in squeak-dev mailing list, September

If someone didn't understand their code and its likely uses well enough to write brief useful comments, why should I imagine that they understood it well enough to write code that works?
O'Keefe in squeak-dev mailing list, June

Realistically, the practice of putting untested code into systems is common, and so are system failures. The excuse I've most often heard for putting in untested code is that there wasn't enough time or money left to do the testing. If there wasn't enough time and money to test the routine, then there wasn't enough time and money to create it in the first place.
What you think is code, before it has been properly tested, is not code, but the mere promise of code - not a program, but a perverse parody of a program. If you put such junk into a system, its bugs will show, and because there hasn't been a rigorous unit test, you'll have a difficult time finding the bugs. As Hannah Cowley said, "Vanity, like murder, will out." It is better to leave out untested code altogether than to put it in. Code that doesn't exist can't corrupt good code. A function that hasn't been implemented is known not to work.
An untested function may or may not work itself (probably not), but it can make other things fail that would otherwise work. In case I haven't made myself clear, leaving untested code in a system is stupid, shortsighted, and irresponsible. GIGO ("garbage in equals garbage out") is no explanation for anything except our failure to test the system's tolerance for bad data.
Garbage shouldn't get in - not in the first place or in the last place. Every system must contend with a bewildering array of internal and external garbage, and if you don't think the world is hostile, how do you plan to cope with alpha particles? But to be really diabolical takes organization, structure, discipline, and method.
Taking random potshots and waiting for inspiration with which to victimize the programmer won't do the job. Syntax testing is a primary tool of dirty testing, and method beats sadism every time.
Finally, the idioms of a language are useful as a sociological exercise ("How do the natives of this linguistic terrain cook up a Web script?"). Idioms are fundamentally human, therefore bearing all the perils of faulty, incomplete and sometimes even outlandish human understanding.
Yes, as a name, xnor generalises well to the n-ary case: I'm confused completely independent of the number of arguments passed to the function.

These are not goals that Intel and Motorola understood, any more than they understood anything important about SW in general. The current caching schemes are rudimentary to say the least. The more interesting architectures today are the graphics accelerators - they don't do anything particularly new, but they at least have some notion of what they are supposed to do and also what they don't have to do when Moore's Law makes it easy to have multiple processors.
Alan Kay in squeak-dev mailing list, March

In fact, flow charting is more preached than practiced. I have never seen an experienced programmer who routinely made detailed flow charts before beginning to write programs. Where organization standards require flow charts, these are almost invariably done after the fact. Many shops proudly use machine programs to generate this "indispensable design tool" from the completed code.
I think this universal experience is not an embarrassing and deplorable departure from good practice, to be acknowledged only with a nervous laugh. Instead it is the application of good judgment, and it teaches us something about the utility of flow charts. Nothing even convincing, much less exciting, has yet emerged from such efforts.
I am persuaded that nothing will. In the first place, as I have argued elsewhere, the flow chart is a very poor abstraction of software structure. Indeed, it is best viewed as Burks, von Neumann, and Goldstine's attempt to provide a desperately needed high-level control language for their proposed computer. In the pitiful, multipage, connection-boxed form to which the flow chart has today been elaborated, it has proved to be essentially useless as a design tool - programmers draw flow charts after, not before, writing the programs they describe.