Signal Types – First Draft

Standard

30Nov13.17h36

SIGNAL TYPE:

A. SIGNAL NOT REQUIRING RESPONSE (NOTICE OR IGNORE)
B. SIGNAL REQUIRING RESPONSE
i) NOT URGENT (RESPOND AT WILL)
ii) URGENT
a) URGENT NOT REQUIRING ACTION (RESPOND, SPECIFY CURRENT STATE)
b) URGENT REQUIRING ACTION
SPECIFY ACTION TYPE & OTHER IMPORTANT DETAILS
c) EMERGENCY => URGENT, REQUIRING IMMEDIATE ACTION.
* ATTEND TO IMMEDIATELY, PREPARE FOR THE WORST, HOPE FOR THE BEST.
C. SIGNALS REQUIRING ACTION AND OTHER RESOURCES:
SPECIFY:
* ACTION TYPE & OTHER IMPORTANT DETAILS.
* RESOURCE TYPE(s) + AMOUNT(s) => PREPARE FOR WORST CASE
D. RELAY: SIGNALS REQUIRING SIGNALS:
REQUIRING CONTACTING OTHER PEOPLE =>
SENDING SIGNAL:
– DOES IT REQUIRE A RESPONSE?
– DOES IT REQUIRE AN ACTION?
– SPECIFY TIME FRAME.
– SPECIFY URGENCY/PRIORITY.
– DOES IT REQUIRE A RESOURCE?
– DOES IT REQUIRE ANOTHER PERSON? (RELAY)
– SPECIFY FURTHER. PREPARE FOR WORST CASE.

Python and Me; Or, The Git That Keeps On Gitting

Standard

There are probably as many ways to learn computer programming and theoretical computer science as there are individuals wanting to learn such things. For me, it all came down to mathematics, believe it or not. It was easy enough to learn the syntax and so on for the Python language, but as soon as I wanted to do anything worthwhile with Python, there suddenly were a dozen branches of mathematics I felt I needed to brush up on.

I am willing to concede that maybe it’s not absolutely necessary to teach computer programming to kids in school. But they never taught us any really interesting and useful mathematics, and for that I hold them guilty. They had what, twelve years to teach us math? And in over a decade the furthest they could take us was basic trigonometry? Are you kidding me?

In the last two years, I was able to teach myself more math than I could shake a stick at. The same goes for theoretical computer science. What were they trying to do, anyway, teach us or just waste our time? In any case, that’s unimportant now. The point is that I felt the need to learn some more or less "advanced" math because I quickly learned that the better I got at general "mathematical thinking", the easier it would be to write computer programs.

That is, if my goal was to write interesting computer programs. And by "interesting", I mean short programs that help me do insane calculations I couldn’t otherwise do on paper. Why else would I want to program a computer? I want to push it to its computational limits. It’s not complicated. I’m not in this for the entertainment value of reading error messages all the livelong day. I’m in this to break my computer, always was, always will be. I want to break the bounds of human thought. Maybe I’m funny that way.

I like problems that quickly become intractable. I like to think the impossible and then go forward with the idea that I can do the impossible. It doesn’t always work, but sometimes, every so often, the impossible happens. And, fortunately or unfortunately, in order to augment my ability to make the impossible happen, I had to learn a whole bunch of math.

Why does this matter? It matters in a more or less non-trivial way. I think that the sooner one realizes that computer science and mathematics are not only related but are more or less the same thing, one a branch of the other, the quicker one can advance in either science. What people aren’t realizing is that there is currently a revolution in mathematics. The entirety of the mathematical apparatus is being rewritten in terms of Type Theory, for one. What this basically means is that you can learn mathematics starting from the new perspective of Type Theory. You can actually start there; you don’t need the rest. This is the new normal. You can start teaching kids basic arithmetic in terms of types and so on. No need to teach them 20th-century mathematics, it’s all being rewritten as we speak anyway.
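The claim that arithmetic can be built starting from types can be made concrete. Here is a standard sketch in Lean (my example, not any particular textbook’s): the natural numbers defined as an inductive type, addition defined by recursion, and a small arithmetic fact checked by the type checker itself.

```lean
-- The natural numbers as an inductive type: a number is either
-- zero or the successor of a number. Nothing else is assumed.
inductive Nat' where
  | zero : Nat'
  | succ : Nat' → Nat'

-- Addition defined by recursion on the second argument.
def add : Nat' → Nat' → Nat'
  | m, Nat'.zero   => m
  | m, Nat'.succ n => Nat'.succ (add m n)

-- 2 + 1 = 3, verified by the type checker: both sides compute
-- to the same term, so `rfl` (reflexivity) closes the proof.
example : add (Nat'.succ (Nat'.succ Nat'.zero)) (Nat'.succ Nat'.zero)
        = Nat'.succ (Nat'.succ (Nat'.succ Nat'.zero)) := rfl
```

Nothing here appeals to prior mathematics: the numbers, the operation, and the proof all come out of the type definitions alone.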

The beauty of that is that finally computers and mathematicians will be speaking the same language. After over a hundred years of speaking foreign tongues, mathematicians and computers will, more and more, be talking to one another directly. I stumbled upon this entirely by accident. I only wanted to brush up on a little math to make more sophisticated kinds of programming easier for myself. Then oops, I accidentally crunched most of modern mathematics in my head, one concept after another, in a two-year math crusade, and developed a full-blown mathematical ontology, again, entirely by accident. Then when I came out of my stupor, I learned that that’s exactly what the top 30 or so mathematicians in the world were currently working on.

They had come to the same conclusions, except that I had only high school math and barely understood basic trigonometry. Anyhow, I will get back to this, but tell me about it: I spent two years learning Python when I could have spent two years at Princeton at the Institute for Advanced Study doing just about the same darn thing! So IAS School of Mathematics, you know where to reach me. I am being more or less facetious, but I definitely wouldn’t refuse an all-expenses-paid voyage to the IAS in Princeton, New Jersey. It should only take me a couple of months to get my math up to date, on par with the other researchers. 😉

Python and Me; Or, Going on Year Three

Standard

I started learning the Python programming language a little over two years ago now. I have kept ample notes and now want to share some of my documentation of the learning process with the world.

It has been a long ride. I have learned much, yet I’m still very much a beginner. Learning to code was not what I thought it would be. In many ways, it was much better, but it was also much more difficult than I had anticipated.

I do think, though, that if I could learn to program, anyone could. It takes a lot of discipline. One has to be dedicated. One has to practise every day and do lots of reading through documentation. But it is doable. I am proof of that.

I started knowing very little about programming. The only programs I had ever written were in BASIC about 30 years ago where I copied them out of a textbook line by line. I knew enough HTML to maybe edit an HTML file and get it to actually work every once in a while. I didn’t even know CSS, though, and so for all intents and purposes, I really was starting from scratch.

But I did it. I’m not any good, but I can write programs that work. They are simple programs, but every day they get a little more sophisticated. It’s actually not all that mysterious. I wish they had just taught us to program in school. If they could teach us trigonometry, they could teach us Python just as easily.

We had French classes in school and they were able to teach us French grammar. I am starting to think that Python’s grammar is much simpler than French grammar. I do think that in many cases it would be easier to learn to write in Python than it would be to learn to write in French. Writing properly in French takes a whole lifetime. I know people in their 70s and 80s and beyond who still make grammatical errors. Sure, experienced Python writers still make mistakes too, but you know what I mean. It doesn’t take a lifetime to learn to code, at least to code something or other, even if it is simple.

Python, however, is much less forgiving than any language like French or English. I can still communicate in English, say, and make mistakes. What I mean is that I won’t get an error message slapping me in the face. And that’s actually what made Python so much easier to learn. Imagine learning a new language, like German or Italian or something, and having your very own private interpreter. They are there all the livelong day, telling you about every single error you make. Like a personal German teacher. And they never let a mistake slip by. So you tend to learn pretty quickly, even if only by trial and error.
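To make that concrete, here is a tiny, made-up example of the personal teacher at work. English would let a phrase like "2 plus two" slide; Python flags the mixed-up types the moment you try it:

```python
# Python never lets a mistake slip by: mixing an int with a str
# raises a TypeError on the spot, like a teacher correcting you
# mid-sentence.
try:
    total = 2 + "two"       # a grammatical error, in Python's eyes
except TypeError as err:
    print(f"Teacher says: {err}")
    total = 2 + int("2")    # the corrected version
print(total)                # prints 4
```

The correction loop is exactly the trial-and-error described above: make the mistake, get told immediately, fix it, move on.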

In any case, I hope to start posting more often on this blog, sharing my experience with Python with the world. Stay tuned for updates. 🙂

Thinking Things Through

Standard

I argue that it is important to properly think things through. I believe that one not only needs to think things through, but that one needs to be systematic and methodical. This applies to almost anything I can think of. I fear that I all too often see people not being careful enough to properly think things through. Simple examples follow, to drive the point home.

You have an idea. It could be anything; you have an idea. It could be great, wonderful, mediocre, it doesn’t really matter. You can benefit from thinking things through, from systematically going through every aspect of the idea, either to improve it or to prepare yourself for potential risks or roadblocks. The decision to tie or to untie one’s shoes is simple enough to accomplish from start to finish: one does not necessarily have to consider extreme existential risks when deciding to tie or to untie one’s shoes. Unfortunately, most "ideas", call them ideas, projects, objectives, goals, programs, what have you, are not exactly like tying or untying one’s shoes.

At the same time, one does not want to speculate and philosophize too much, what is often referred to as Analysis Paralysis. However, it remains true, I think, that at times what you think you want is not really what you want. Say you have an objective, a project, a goal. Call it X. X in this case I would simply call "something that you want". So you have stated your goal, that you want X. But is X really what you want, or is it just what you think you want? Is X really the X you think it is, or is it a Y or a Z instead?

Taking the time to carefully think things through can be such a useful tool for this sort of thing. It means that you will have examined the thing closely, with meticulous care, and should be prepared to follow through with it while being attentive to potential threats, barriers, risks, roadblocks, etc. But if you do not systematically and methodically think things through, then you are leaving yourself vulnerable to "attack" (an attack from an adversary might be how one would frame it in terms of information security, but the same thing holds for many other domains, i.e. an "attack", a.k.a. a "risk", from an "adversary", a.k.a. "Nature". It doesn’t have to be an actual attack per se; it just means a situation where you are vulnerable to disadvantages).

Embracing The Patternless Uncertainty of Being

Standard

And that’s the problem. That perennial quest to solve all creative tensions, that good old-fashioned pattern-seeking brain of ours, is going to kill us if we aren’t careful. One must stop chasing after butterflies, cease trying to reduce uncertainty, and embrace it instead. Or keep chasing after butterfly patterns in the wallpaper if it helps you sleep at night.

This is what I have been saying all along: the perils of too much abstraction, of abstraction for the sake of abstraction. What happens when you craft a message that has zero uncertainty?

Let’s put it this way. You have a two-sided coin (unfair), with both sides being Heads. With regard to the results of a coin toss with this two-headed coin, there is no uncertainty, hence there is zero entropy. Unless you are betting against an adversary who always chooses Tails, tossing this coin will always result in no new information. A pointless exercise?

I see that as involving the same kind of dangers as being right all the time. My predictions with this unfair coin are always right. It is always Heads, exactly as I predicted, every time. My predictions are valueless, however. There was no real uncertainty to begin with.
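That zero-uncertainty claim can be checked directly with Shannon’s formula, H = −Σ p·log₂(p). A quick sketch in Python (the function name and variable names are mine):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping p = 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

fair_coin = [0.5, 0.5]    # genuine uncertainty: one full bit per toss
two_headed = [1.0, 0.0]   # always Heads: zero entropy, no new information

print(entropy(fair_coin))   # 1.0
print(entropy(two_headed))  # 0.0
```

The fair coin delivers one bit of information per toss; the two-headed coin delivers none, which is exactly why the always-right prediction is worthless.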

Finding a set of patterns and then seeing those patterns repeat themselves everywhere is a very perilous game to play, if you ask me. As I said in the beginning, one should not always seek to reduce uncertainty; one should embrace it instead. I could go on and say the same thing about Noise, because it can be just as valuable. When all you are left with are beautiful signals, then call me. I am the noise-maker; I will fix those signals stat, render them inert, unrecognizable. I will take out the pattern and leave only the patternless uncertainty of being. Embrace it, or keep chasing after butterflies.

Sound Design – Theory or Practise?

Standard




I was fortunate enough to undertake the formal study of computer-assisted sound design, many ages ago. I will always remember being asked if it was worth it, if it was worth it to go back to school, or put more simply, if the education I was getting was worth the cost of the education. I was asked this countless times, while I was at school and after I graduated. My answer was almost always the same, "It depends."

It depends on what it is you are studying. It depends on how much you value what you are learning. It depends on a lot of things. One might have thought that if I was going to learn sound design, that I could have invested in equipment instead and just learned it on the fly, practising on the equipment itself, instead of taking a bunch of classes full of theoretical knowledge. That might be true, but a) recording equipment goes out of style and b) great recording technique does not. Also, I’m sure that the young Mozart had one or two music lessons.

However, I always welcomed the question because I felt it was pertinent. I wanted to do sound design, right? So why not just do it instead of learning about doing it? I could have, and I more or less did both in any case. The problem was that I had already been doing sound design and sound recording, and wanted to take the craft to another level. I had already made it some distance on my own and decided to take a short cut: I would study with professionals, they would teach me their best practises, and it would cost me something. Otherwise, I very well could have tried to do it on my own, and now, almost 20 years later, I might still be trying to figure out a whole slew of things that I didn’t have to figure out because someone just told me.

I liken it to trying to teach oneself Calculus. I tried to teach myself calculus once. I spent a total of about 12 years and made almost no progress whatsoever. Two years ago, more or less, I started taking MOOCs, Massive Open Online Courses, and happened to take a course or two on Calculus, or else courses that required some level of calculus that I did not have. It turns out calculus is not that hard; it’s really not any more difficult than anything I learned in high school. We did trigonometry in high school, but not calculus, and now I wonder why we weren’t taught both at the same time, or why mathematics in general was taught at such a slow pace in school. But that is another story. The point is that calculus was much easier to learn with a teacher than it was on my own. I’m still not that great at calculus, but in two years of taking MOOCs, I learned infinitely more about it than I did in 12 years battling it out on my own.

However, this isn’t true of everything. Believe it or not, calculus and sound design have something in common: they are both technical. Granted, anything can be technical depending on how we frame it, just as it could be said of almost anything that there is a Right and a Wrong way to do it, i.e. there’s a Right way to do Historiography and a Wrong way to do Historiography. The difference is that in calculus, or in many other branches of mathematics, you either solve a problem or you don’t, just as in sound design, you either produce great sound design or you don’t. Historiography in that sense might be the subject of debate, but in audio engineering, sound recording, etc., you either did a good job or you didn’t; it either sounds great or it sounds like spaghetti. I will come back to this.