Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
I am as ignorant of comic books as you are, so these are only vague impressions I have gotten by watching/reading about them (rather than reading them directly). Of course I have read some comic books in the past (such as Hulk and Spiderman), but that was a long time ago, and I wasn't an aficionado even then.
Kyrsimys wrote:
2) Where to start? I know they've been publishing Superman comics, for example, since the 1930s. Do I need to start at the beginning to understand everything that's going on today? If not, what would be a good place to start? I understand the DC universe has just undergone some massive reboot, so should I just start from there or would I really be missing out if I did?
IIRC the DC universe has undergone something like 3 "reboots" (justified in-universe, rather than simply starting from scratch without explanation). I'm not expert enough to answer your question in a competent manner, but if I had to say something, then I'd say that if you want to get into the current DC universe (and have the resources available), start from Crisis on Infinite Earths (1985). AFAIK it's the first major (in-universe) "reboot" and something of a starting point for the "modern" DC comics.
3) DC or Marvel? Do you feel one is clearly better than the other? What are the main differences between the two? Which one has more interesting characters?
I have got the impression that Marvel comics tend to be darker, grittier and more violent, while DC comics tend to be more idealistic and light-hearted (although there are probably individual counter-examples on both sides; Batman is probably such a counter-example on the DC side, although it probably depends on the writer).
nfq
Player (94)
Joined: 5/10/2005
Posts: 1204
Tub wrote:
You spent 2 minutes on an 8-page article before giving up? Even the fastest readers in the world couldn't have read it in that timeframe, much less understood it. You certainly didn't click any of the links to acquire the necessary background information, either.
*lolling out loud* thanks for the laughs again, you are a very humorous person. I don't have enough knowledge about computer stuff because I haven't been exposed to computers from an early age, so it's hard for me to read that article. It's kinda like starting to read a Wikipedia article about some specifics of quantum physics without having much background knowledge about it.
Kyrsimys wrote:
1) Which ones should I read?
Donald Duck comics, particularly those made by Carl Barks. The Akira manga is also worth reading. Manga is the Japanese word for comics.
Skilled player (1741)
Joined: 9/17/2009
Posts: 4981
Location: ̶C̶a̶n̶a̶d̶a̶ "Kanatah"
I got a question. My computer only lets me set dates ranging from 1980 to 2099. Is there a way to set the date further back, and further ahead?
Editor, Experienced player (570)
Joined: 11/8/2010
Posts: 4036
jlun2 wrote:
I got a question. My computer only lets me set dates ranging from 1980 to 2099. Is there a way to set the date further back, and further ahead?
The hardware clock in a common PC only has a set range in which it can operate, and that range is from January 1, 1980 to December 31, 2099. The clock cannot operate outside that range, so it cannot be set any further forward or backward. These clocks were likely never intended to last past 2099, so they simply weren't programmed past that year; when the year 2099 ends, such computer clocks will stop running.
Joined: 5/2/2006
Posts: 1020
Location: Boulder, CO
Can anyone explain what entropy is in a thermodynamic context? I believe that I understand entropy as the term is used in information theory, but articles like the one on Wikipedia are Greek to me. Is it that the concept in information theory is so different from what it means in thermodynamics that the association is just confusing me, or is there some relationship that I just don't see?
Has never colored a dinosaur.
Player (80)
Joined: 8/5/2007
Posts: 865
Twelvepack wrote:
Can anyone explain what entropy is in a thermodynamic context? I believe that I understand entropy as the term is used in information theory, but articles like the one on Wikipedia are Greek to me. Is it that the concept in information theory is so different from what it means in thermodynamics that the association is just confusing me, or is there some relationship that I just don't see?
There are a few different (almost equivalent) definitions of entropy. The grade school definition is the amount of disorder in a system. This has fallen out of favor among physicists for being qualitative, subjective, and in some cases, flat out wrong.

The second definition (which I'm most comfortable with) is the log of the number of microstates of a system corresponding to a given macrostate. We might look at a gas in a box and count how many ways it can exhibit the pressure and temperature we see. If you take the log of this number and multiply it by the Boltzmann constant, you get the entropy of the system. A classic, simpler example is the magnetization of a paramagnet in an external magnetic field. We first observe the macrostate (the net magnetization) and then count how many microstates (individually flipped dipoles) correspond to that macrostate. Take the log, then multiply by the Boltzmann constant and you have the entropy. (Interestingly, if you go down that road with the paramagnet example, you'll find it can lead to negative temperature.) That's probably what you read on Wikipedia, and I acknowledge I haven't done a very good job of explaining it.

The last definition -- which is possibly my favorite -- is that the entropy is whatever it needs to be such that the First Law of Thermodynamics is satisfied. The first law is:

dU = T*dS - P*dV + mu*dN

We add a little bit of energy to a system and, after accounting for the changes in the volume and number of particles (if applicable), all other changes in energy must be in the T*dS term. Presumably, the energy added was small enough to leave T essentially unchanged, so you're really measuring dS.

It turns out that these last two definitions are compatible with each other. (I find the last definition to be a little more general, however. If you believe in conservation of energy, you believe in entropy.)

Admittedly, statistical mechanics has never been my strongest subject. Marzojr will probably come by to school me... Also, I know pitifully little about information entropy. Could you give me a quick definition?
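If a concrete toy calculation helps, here's a rough sketch of that counting recipe in Python (my own illustration, nothing rigorous): count the microstates of a paramagnet with a given net magnetization, take the log, and multiply by the Boltzmann constant.

Code:
# Toy illustration of S = k_B * ln(number of microstates) for a paramagnet
# of N dipoles, n_up of which point along the external field (the macrostate).
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, in J/K

def paramagnet_entropy(N, n_up):
    omega = comb(N, n_up)    # microstates giving this net magnetization
    return k_B * log(omega)  # S = k_B * ln(omega)

print(paramagnet_entropy(100, 50))   # about 9.2e-22 J/K (the most disordered macrostate)
print(paramagnet_entropy(100, 100))  # 0: only one microstate, all dipoles aligned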
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Bobo the King wrote:
There are a few different (almost equivalent) definitions of entropy. The grade school definition is the amount of disorder in a system. This has fallen out of favor among physicists for being qualitative, subjective, and in some cases, flat out wrong.
Isn't a more modern, semi-informal definition that entropy describes how much energy there is available for useful work in a system? The amount of such energy in a closed system can never increase (which is why it's impossible to have a perpetual motion machine that produces extra energy from nothing). (Of course, the problem with this definition is how you define "energy available for useful work"...)
Active player (315)
Joined: 2/28/2006
Posts: 2275
Location: Milky Way -> Earth -> Brazil
Twelvepack wrote:
Can anyone explain what entropy is in a thermodynamic context? I believe that I understand entropy as the term is used in information theory, but articles like the one on Wikipedia are Greek to me.
http://www.youtube.com/watch?v=5bueZoYhUlg
"Genuine self-esteem, however, consists not of causeless feelings, but of certain knowledge about yourself. It rests on the conviction that you — by your choices, effort and actions — have made yourself into the kind of person able to deal with reality. It is the conviction — based on the evidence of your own volitional functioning — that you are fundamentally able to succeed in life and, therefore, are deserving of that success." - Onkar Ghate
Bisqwit wrote:
Drama, too long, didn't read, lol.
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Warp wrote:
Isn't a more modern, semi-informal definition that entropy describes how much energy there is available for useful work in a system? The amount of such energy in a closed system can never increase (which is why it's impossible to have a perpetual motion machine that produces extra energy from nothing). (Of course, the problem with this definition is how you define "energy available for useful work"...)
That would be exergy: http://en.wikipedia.org/wiki/Exergy
Joined: 5/2/2006
Posts: 1020
Location: Boulder, CO
Bobo the King wrote:
Also, I know pitifully little about information entropy. Could you give me a quick definition?
I think the simplest way of thinking about it is as the amount of information gained by observing previously unknown values. It can be expressed in units that would be familiar to computer scientists (bits or bytes), but because it is a measure of information gained, the entropy is zero if you can somehow predict the unknown values. If I told you that I flipped a double-headed coin 10 times and recorded the results, there is no information entropy, because you already know what the result was and gain no information by looking at what I recorded. If that coin was normal (as likely heads as tails), the results of the 10 flips would have 10 bits of entropy, because any guess you make at each of the 10 flip results is as likely to be wrong as right, so you gain 1 bit of information per flip.

What makes it hard to think about is the information gained when one outcome is far more likely than another. If I had 10 lottery tickets last week, the amount of information gained by finding out whether they won the lottery is almost zero, because for each ticket, it is really unlikely that it was a winner. The result can be predicted with a high degree of accuracy, so the entropy is small.
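For what it's worth, here's a tiny sketch of the usual formula, H = -sum(p*log2(p)), applied to those examples (my own toy code, so take it with a grain of salt):

Code:
# Shannon entropy, in bits, of a set of outcome probabilities.
from math import log2

def shannon_entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))             # double-headed coin: zero bits per flip
print(shannon_entropy([0.5, 0.5]))        # fair coin: 1 bit per flip, so 10 flips carry 10 bits
print(shannon_entropy([1e-7, 1 - 1e-7]))  # near-certain lottery loss: a few millionths of a bit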
Has never colored a dinosaur.
Editor, Skilled player (1536)
Joined: 7/9/2010
Posts: 1319
Question: Should I learn Lua scripting for TASing?
Favorite animal: STOCK Gt(ROSA)26Sortm1.1(rtTA,EGFP)Nagy Grm7Tg(SMN2)89Ahmb Smn1tm1Msd Tg(SMN2*delta7)4299Ahmb Tg(tetO-SMN2,-luc)#aAhmb/J YouTube Twitch
Player (80)
Joined: 8/5/2007
Posts: 865
TASeditor wrote:
Question: Should I learn Lua scripting for TASing?
Answer: Yes.
nfq
Player (94)
Joined: 5/10/2005
Posts: 1204
Question: If I'm playing an FPS game and I'm facing a wall, does the rest of the game-world exist behind me? Maybe somewhere in memory at least, right?
Bobo the King wrote:
Answer: Yes.
What can it be used for?
Player (80)
Joined: 8/5/2007
Posts: 865
nfq wrote:
What can it be used for?
What do you want to do? Lua can do it.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
nfq wrote:
Question: If I'm playing an FPS game and I'm facing a wall, does the rest of the game-world exist behind me? Maybe somewhere in memory at least, right?
No, it's completely destroyed. When you turn around, it magically appears back from nothing.
nfq
Player (94)
Joined: 5/10/2005
Posts: 1204
Warp wrote:
No, it's completely destroyed. When you turn around, it magically appears back from nothing.
lol, I doubt it's that simple, you must be sarcastic... It would be interesting to know a bit about how it works. I don't know as much about these computer things as most people here. In dreams and reality, things exist in our memory when we're not looking at them, and I've heard that it's similar for games, that things always exist somewhere in memory.
Editor, Reviewer, Experienced player (979)
Joined: 4/17/2004
Posts: 3109
Location: Sweden
nfq wrote:
lol, I doubt it's that simple, you must be sarcastic...
Your sarcasm detector is working. Warp did not give an intelligent answer.

In an FPS game, traditionally, the whole level is loaded into memory. Depending on the player's location and direction, the game engine decides what the player sees, and draws those objects on the screen. This article has some info on how a simple engine, the Doom engine, chooses what to draw on the screen: http://en.wikipedia.org/wiki/Doom_engine

In more recent games there are probably mechanisms for unloading parts of the world which are distant and loading parts which are close. For example, I doubt that GTA4 keeps the whole city in memory.
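Something like this deliberately naive sketch (my own toy code, not how the Doom engine actually works) illustrates the idea: the whole level sits in memory, but each frame only the objects roughly in front of the player and within view distance get drawn.

Code:
from math import cos, sin

def visible(player_pos, facing_angle, obj_pos, max_dist=100.0):
    dx, dy = obj_pos[0] - player_pos[0], obj_pos[1] - player_pos[1]
    if dx * dx + dy * dy > max_dist * max_dist:
        return False                  # too far away to bother drawing
    fx, fy = cos(facing_angle), sin(facing_angle)
    return dx * fx + dy * fy > 0      # positive dot product = in front of the player

level_objects = [(5, 2), (-30, 0), (250, 10)]   # everything exists in memory...
player, angle = (0, 0), 0.0
print([o for o in level_objects if visible(player, angle, o)])
# ...but only (5, 2) gets drawn; the object behind the player and the distant one are skipped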
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Truncated wrote:
In an FPS game, traditionally, the whole level is loaded into memory.
That may have been so in the past (and might still be so for the simplest 3D games), but 3D game levels nowadays tend to be so large and so detailed that scenery is usually loaded dynamically from disk as needed, rather than having every single part of the enormous level, with every single tiny detail, loaded at once. This is especially true for open world sandbox games, but it's also true for most linear FPS railshooters as well.

(For example, if you play the HL2 expansion Lost Coast, there's a point where you can stack up objects in order to get over a fence you are not supposed to be able to cross, and that way you can return to a previous part of the level. There you'll see how most of the details are gone and only very large, rough polygons are showing. The game is keeping only a very low-detail version of the previous parts of the level, which might be visible from a distance, but which should not be reachable normally.)

Naturally, when drawing the visible part of the level, the game engine tries to minimize the amount of data sent to the graphics card. (The optimal way of rendering the scene is, of course, to send only those polygons to the graphics card that affect the screen pixels, and nothing more. This is actually quite complicated to achieve perfectly, so there will always be some extraneous polygons sent which will not end up visible in any way, usually because they are obscured by other polygons.) The more polygons are sent to the graphics card to be drawn, the heavier the scene is to draw and thus the lower the framerate, which is why this is optimized as much as possible.

Usually only polygons in front of the camera are sent to the graphics card. (Of course that's only the first step in pruning away hidden polygons. The next steps are much more complicated.)
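As a very rough sketch of the streaming idea (my own invention; no real engine works exactly like this), imagine the world divided into chunks, with only the chunks near the player kept at full detail:

Code:
def update_streaming(player_chunk, all_chunks, radius=2):
    # In a real engine this would load/unload geometry from disk; here we just
    # record which chunks are kept at full detail and which get a rough stand-in.
    return {c: ("full detail" if abs(c - player_chunk) <= radius else "low detail")
            for c in all_chunks}

print(update_streaming(3, range(10)))
# chunks 1..5 come out as 'full detail', everything else as 'low detail'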
Joined: 7/2/2007
Posts: 3960
As far as existing in terms of physics goes, it depends on the game. But generally, for singleplayer games, the player effectively carries a sphere of life around with them; everything within a given distance of the player exists even if they aren't looking at it, and everything further away doesn't exist. There's little point in simulating a pedestrian walking into a lamp post on the other side of the city when that will have a practically nil impact on the player. I would expect that in a multiplayer game, everything is simulated. Otherwise someone would find some way to turn the inactivity to their advantage...
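For the singleplayer case, a tiny sketch of that "sphere of life" idea (again just my own toy code): only entities within some radius of the player get their simulation updated each tick, and everything else stays frozen.

Code:
def tick(player_pos, entities, radius=50.0):
    for e in entities:
        dx, dy = e["pos"][0] - player_pos[0], e["pos"][1] - player_pos[1]
        if dx * dx + dy * dy <= radius * radius:
            e["steps"] += 1   # stand-in for running this entity's AI/physics

pedestrians = [{"pos": (10, 0), "steps": 0}, {"pos": (400, 300), "steps": 0}]
tick((0, 0), pedestrians)
print(pedestrians)   # only the nearby pedestrian was updated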
Pyrel - an open-source rewrite of the Angband roguelike game in Python.
Joined: 5/2/2006
Posts: 1020
Location: Boulder, CO
I guess what I am really getting at is how does
Bobo the King wrote:
The second definition (which I'm most comfortable with) is the log of the number of microstates of a system corresponding to a given macrostate. We might look at a gas in a box and count how many ways it can exhibit the pressure and temperature we see.
relate to
Warp wrote:
Isn't a more modern, semi-informal definition that entropy describes how much energy there is available for useful work in a system
I really don't see any connection between these two concepts. They both make sense on their own, but I keep getting confused trying to bridge the gap. Are they just separate ideas that share a name?
Has never colored a dinosaur.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Twelvepack wrote:
I really don't see any connection between these two concepts. They both make sense on their own, but I keep getting confused trying to bridge the gap. Are they just separate ideas that share a name?
Don't ask me. I have never heard of anything like what Bobo wrote, nor do I understand a word of it; but then, I'm not a physicist and I have only an extremely cursory knowledge of these things. Anyway, as I said, what I wrote is a more or less informal definition of entropy. How you define it in terms of physical laws and quantities is beyond my knowledge.
Joined: 5/30/2007
Posts: 324
Bobo the King wrote:
The second definition (which I'm most comfortable with) is the log of the number of microstates of a system corresponding to a given macrostate.
Yes, this is the standard definition of entropy given in all the basic quantum, thermal, and statistical mechanics textbooks. It's usually written as S = k*Ln(s), where Ln is the natural logarithm and s is the number of states. Meanwhile, Warp's "informal" definition of entropy appears to be vague and very likely wrong. According to pw3ner's posts, it's a completely separate physical property.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
IronSlayer wrote:
Meanwhile, Warp's "informal" definition of entropy appears to be vague and very likely wrong.
Care to explain to us mere mortals what exactly is so wrong about it?
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Energy available in a system is measured in Joules, so we can conclude that it has the dimension of energy. Entropy is measured in Joules/Kelvin, so it has the dimension of energy divided by temperature. Since the dimensions of the two quantities are different, they can never describe the same physical quantity.

Moreover, entropy is a property of the system only, whereas how much work you can extract from a system depends on the neighborhood of the system. You can see how this works with some basic thermodynamics. Consider your system as the hot source and its neighborhood as the cold one, and put a Carnot machine between them: the colder the neighborhood, the more efficient your machine is, so such a property must also take the neighborhood into account (this is a horrible heuristic argument, but the best I can do). So, as you see, entropy is system only, exergy is system+neighborhood; they can't be the same.

Finally, for irreversible processes, the entropy of the system plus the entropy of the neighborhood always increases, while exergy always decreases.
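To put a number on that heuristic (my addition, just the textbook Carnot result): the maximum work you can extract from an amount of heat Q taken from the system at temperature T, with the neighborhood at temperature T0, is

W_max = Q*(1 - T0/T)

which explicitly depends on the neighborhood temperature T0, while the corresponding entropy change of the system, Q/T, involves only the system itself.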
Former player
Joined: 1/17/2006
Posts: 775
Location: Deign
Just quoting the first sentences of some Wikipedia articles here. Gibbs free energy seems to match the definition Warp gave for entropy most closely: "In thermodynamics, the Gibbs free energy (IUPAC recommended name: Gibbs energy or Gibbs function; also known as free enthalpy[1] to distinguish it from Helmholtz free energy) is a thermodynamic potential that measures the "useful" or process-initiating work obtainable from a thermodynamic system at a constant temperature and pressure (isothermal, isobaric)."

Entropy seems to be the amount of energy 'lost' or 'wasted' by a system (or rather, it is proportional to it), rather than the energy available: "Entropy is a thermodynamic property that can be used to determine the energy not available for work in a thermodynamic process, such as in energy conversion devices, engines, or machines."

Enthalpy is the total energy available in the system: "Enthalpy is a measure of the total energy of a thermodynamic system."
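Putting those together (my addition; this is just the standard textbook relation, not anything from the quoted articles): at constant temperature and pressure,

G = H - T*S

i.e. the "useful" energy (Gibbs free energy G) is the total energy content (enthalpy H) minus temperature times entropy, which is roughly the sense in which entropy measures energy that is not available for work.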
Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign aqfaq Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign Deign