Posts for EEssentia


Experienced Forum User
Joined: 4/13/2009
Posts: 431
The cinematics are just annoying, if you ask me. They are not part of the true TAS--they're skipped as fast as possible! They're not really meant to be watchable/readable. Thankfully, real media players make it a cinch to skip them.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Sheesh. People, calm down. No need to flame and bash each other (plus it's against the rules). Everyone has opinions, but no one should be flamed for them. Everyone needs to keep the discussion at a friendly level. (I shouldn't have to mention this, nor should anyone else for that matter.)
andrewg wrote:
"We also prefer quality over quantity - a poor quality run will not be accepted whether it is a game new to the site or an improvement to a pre-existing run" I'm thinking that since this run is improvable by so much, it really shouldn't be accepted. What about if I beat this run in real-time? Would TASvideos accept it?
I like to see it from a different view. Just because there are tricks and routes that aren't in there doesn't automatically mean it's lower quality. Quality, to me, is the overall impressiveness of the movie. If it looks outright bad, then no matter how fast it is, reject it. But if that isn't the case, then even if it is suboptimal, I say accept it, because it clearly meets the quality demands of the site, even though it could be better. But then again, we don't restart a run every time we find a new route or trick, do we?
Experienced Forum User
Joined: 4/13/2009
Posts: 431
I don't see what the problem is. This IS the best TAS available at this time. When the new TAS comes out, your TAS, it will blow this one out of the water, and you will show the world what TRUE OOT TASing is! Call this one a ... teaser, if you will.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Tub wrote:
There isn't too much overhead in virtualisation. The faster CPU will probably offset the virtualisation overhead in the most common scenarios. (Unless you're running IO or GFX-heavy applications, but no amount of CPU-power will make that faster).
I would say that virtualization can take up quite a lot of CPU power. Nothing to scoff at.
no, because of the inherent difficulty of providing the same features within smaller space. Netbooks aren't really more expensive though, they usually range between 200 and 400€, while notebooks are 300 to 1000€.
I wouldn't say that. There is little profit in a low-cost product, so they try to include extra stuff to bring the price of netbooks up. If you want a reasonably good netbook, it's going to cost $300+, at the very least. There are even some priced at $500+! There are also "ultraportables", which are really a better option, IMHO. They are higher priced, but they try to bring the best of both worlds: good battery life with performance.
This is a highly biased opinion, but netbooks are crap. Their performance is just absolutely disgusting. Owning and using one has brought me to this conclusion. My advice is: don't buy anything with an Atom CPU. And don't buy anything with a resolution lower than 768 pixels in height (most netbooks have a resolution of 1024x600). Again, purely biased opinion, but it's what I think. Definitely try one out before purchasing.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Dacicus wrote:
What about VMWare? Sysinternals has a program (disk2vhd) that lets you clone a HDD and convert it to a virtual machine. I'm thinking about doing that with this laptop's HDD for those programs that will refuse to work under newer versions of Windows. I hope there aren't too many of those, but I assume that might require more CPU power than a command-line only VM of Debian.
Windows 7 Professional and higher has XP Mode: a mode where you can install your XP applications and run them from inside Windows 7 just like normal programs, except that they run through the XP OS. Of course, this requires a CPU with hardware virtualization support, and Atom doesn't have it (the Core 2 Duo line does, but not all models, so check first; thanks, Intel!).
No, I knew what you meant; I just wrote about an unusual way to interpret what you wrote. To get back to your real meaning, I remember reading an online article a few weeks ago about Win7's new method of handling RAM wherein it tries to fill the RAM with files and programs it predicts (somehow) you might use most frequently. I don't remember too many details because I didn't anticipate having to switch to Win7 at the time, but someone brought up the point that this might wear out RAM faster than previous versions of Windows. Unless I misunderstood how that works, it doesn't seem like an improvement.
Wear out RAM? Can that even happen? Using all of your RAM consumes barely any more power than using just a trickle of it, so why not use all your available RAM for something useful, such as cache? Vista did it. Linux does it. Everything does it now. It's for the good.
You're saying that netbooks cost more just because they have a longer battery life (and lack a ton of near-essential features)?
You could see it that way. The price/performance ratio is certainly worse. The likely cause is that netbooks used to be so cheap that companies didn't make any profit on them, so they pushed the price up to what we see now. It's unfortunate, but true.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Dacicus wrote:
I agree with all those points, but one purpose of this topic was to learn if there are any specific models or brands that have been problematic for multiple people, which I'm assuming would indicate they are more likely to be problematic for me.
The thing is that you will get widely different answers, and none is better than another. Everyone loves brands, some more than others. Some people have had nothing but problems with one brand, while others haven't had any trouble at all. That is why bringing up brands is a bad idea, if you ask me.
Probably not, but I'd like to have the option. [As a totally unrelated aside, I found your comment about taking advantage of hardware better with Win7 somewhat ironic in that it is my understanding that you can't actually access the hardware directly in protected-mode OSes like Windows. I know that's not what you meant.]
I can assure you, I haven't needed a floppy in years. Newer operating systems don't need them. If you really, really, really, really, for some silly reason do need one, you can always get an external one. If you can get your hands on one, that is. They're pretty outdated by now.
As for the hardware: no, you misunderstood me. Windows 7 takes better advantage of memory, power management, SSDs, multiple cores, etc. than XP does. That's what I meant. And it will likely keep doing so for some time to come. I can't see USB 3.0 support coming to XP, for example. But you can bet your hat it will come to 7 ;)
Now, for the rest, you need to choose between:
- faster, better performance, more value for the money, DVD drive, bigger screen, medium battery life (laptop), OR
- slow, expensive, no DVD drive, small screen and keyboard, but good/great battery life (netbook).
Sir VG wrote:
Netbooks are slow and non-upgradable.
Um, what? I can certainly change some basic things, like the HDD and RAM if I want to. I'm sure there are other things that can be changed out if you feel like tearing it apart, just like any other laptop.
Yes, you can upgrade the RAM (usually one stick only) and swap the HDD, but not much more than that. Laptops are pretty much all-upgradable, except for the gfx card. Also, beware that JavaScript, Flash and Silverlight will suck the life out of your Atom CPU, so I'd still argue that a non-Atom CPU (a real CPU) is best for an optimal web experience.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
I don't remember now, but I do believe the possibility must have occurred to me... But I might just try it out again in the future. There is another thing, though. It seems that the bitrate CRF ends up using can also vary greatly depending on the source, and that kind of seems to defeat the purpose of it. Except that it's faster since it's 1-pass.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
I'm not sure if we're implying the same thing! In my experience, CRF consumes way too much bitrate in hitting a desired "quality" factor. I may need to experiment with this feature, but my results so far have shown that I can hit a much smaller size with 2-pass bitrate encoding (with basically the same visual quality) than with CRF (which I used before). But anyway, this is not a fact. This is merely my experience; others' mileage may vary. I'm not enough of an expert in that area to bust a myth, if there is one.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Aktan wrote:
You guys are all very late. I already helped him and he's done now.. lol
Yeah, I figured!
EEssentia: While 2-pass does distribute the bits better, its quality is only slightly better than CRF. The main problem is guessing the bitrate. Unless you wanna reencode the file 10 times just to find the bitrate you "like", CRF is a lot faster. There is a middle road, though: encode in CRF first just to find a suitable bitrate, then encode again in 2-pass. It's basically like 3-pass.
The problem is indeed the bitrate, but with 2-pass bitrate encoding you can get the same quality with substantially less bitrate, so that is why I would recommend it over constant rate factor. It may be in the range of 2-3 times more efficient.
Also, MeGUI is nice and newbie friendly, except that last time I checked, the x264 build MeGUI updates to is really outdated.
Thankfully, you can replace those with up-to-date ones fresh from x264.nl ;)
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Meh. This is old, but anyway. In my experience, 2-pass bitrate encoding compresses way better than constant rate factor. Hit the right bitrate and the result will be way smaller than a constant rate factor encode of the same quality. This is using MeGUI's profiles, so there shouldn't be any problems, I hope.
And no, MeGUI isn't outdated, nor is it bad. It is a very good encoding tool. Remember that it only loads .avs files, so any AviSynth filtering you think is needed can be added to the .avs file loaded into MeGUI.
For audio, I would just recommend BeHappy. BeHappy and Nero's AAC tool are what I use to encode. Terrific quality at low bitrates and no messing with extraction or command-line tools. It's a snap, and it's fast.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Honestly, the video tag in today's browsers is pretty crappy. Even Flash 10.x usually performs better with hardware acceleration. Otherwise they're pretty close (IOW, equally crappy).
I second this: don't buy Atom. It's such a weak CPU. It definitely won't last you many years! Also, drop the floppy. Do you honestly need one today? I think you'll be surprised how little you do. It's dead technology. Let it remain dead.
Then there's the XP vs 7 argument. It's simple: go 7. XP is old and won't be supported for much longer. Please do us all a favor and drop XP. Then developers can concentrate on Vista as the minimum platform, giving us a much richer experience as a result. Plus there are a lot of other benefits, such as taking better advantage of your hardware and being updated for the future. Now that there is 7, there is simply no reason to stay with XP, since you can skip the "terrible" Vista :)
Other than that, it's sticks and stones. Brands are good for some, bad for others, so everyone usually recommends something different. Go for what seems to fit you best, I say. And definitely go for a notebook, not a netbook. Netbooks are slow and non-upgradable. You can save yourself some money by upgrading some components later to breathe some life into a dying system.
Then there's the question of what you actually need. Do you need SATA? Do you need Core i? Do you need fast DDR2 memory, or maybe DDR3? Is it important to be able to have a lot of memory in there, say 4 GB, or is 2 GB enough? Those are some of the questions I would look into. Then I would find some models, read the reviews, and choose the cheapest.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
No, what it needs is ffdshow ;) A downloadable version works too! Thanks for the encode. Looks interesting!
Experienced Forum User
Joined: 4/13/2009
Posts: 431
How about using Fraps to capture? That is, if Mupen supports playback. But it does, doesn't it? Fraps should capture only a specific window.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Size is irrelevant! So long as it's hosted on a reasonably fast server. But this is awesome. Will download right away!
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Seconded. Actually being able to watch it would be nice, so that I can comment on the (obviously) awesome run!
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Zurreco wrote:
Those smv links are useless without streaming media links as well!
So true...
Experienced Forum User
Joined: 4/13/2009
Posts: 431
adelikat wrote:
It was intended to get simple user feedback. Is downloading higher quality movies at a larger filesize desirable?
Yes! I always go for the highest possible quality, regardless of filesize!
Experienced Forum User
Joined: 4/13/2009
Posts: 431
One long TAS with lots of goodness? Yes, please :)
Experienced Forum User
Joined: 4/13/2009
Posts: 431
o_O No, just nonsensical, inefficient languages...
Experienced Forum User
Joined: 4/13/2009
Posts: 431
NameSpoofer wrote:
Mix it up eh? Post some C# and Java lines. I'm currently taking these 2 classes so give me a reason to be on here during class. :P
public class Test {
   public static void main(String[] args) {
     String s = "Java";
     StringBuffer buffer = new StringBuffer(s);
     change(buffer);
     System.out.println(buffer);
   }

   private static void change(StringBuffer buffer) {
     buffer.append(" and C#");
   }
}
<3
Oh, I so hate Java... and C#.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Warp wrote:
The ?: operator in C/C++ can be quite handy to express things more briefly, but should of course be used with care. The real fun begins when you start nesting ?: operators. It can get quite confusing quite soon. Even then, though, there are still situations where nesting ?: can be acceptable. For example, I consider this to be an acceptable usage:
bool Point::operator<(const Point& rhs)
{
    return x != rhs.x ? x < rhs.x : y != rhs.y ? y < rhs.y : z < rhs.z;
}
Not that an if-elseif-else block wouldn't do the same, but it's more verbose.
If you do that, then you need to format it so that it is more readable. For example:
bool Point::operator<(const Point& rhs)
{
    return x != rhs.x ?
        x < rhs.x : y != rhs.y ?
        y < rhs.y : z < rhs.z;
}
It's funny how much this thread has derailed from C to C++, though... :rolleyes:
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Whoa, you poked holes in all my arguments. But you are right.
Warp wrote:
The "using namespace std;" is a really, really bad habit that approximately 100% of beginner C++ programmers have (the percentage begins to lower the more advanced the programmer is). This sometimes goes to such ridiculous extents that you may even see a short example code which has a "using namespace std;" in order to save writing one "std::" prefix in the later code (thus the total amount of writing that was saved was -16 characters). What most C++ programmers (beginner and intermediate) don't seem to realize is that the "std::" prefixes actually make the code easier to read, not harder. For example, suppose you have a line of code like this:
search(i1, e1, i2, e2);
Is that calling some local function, perhaps some member function of the current class, a function defined in some of the header files included in this file, or maybe a function in the standard library? It's impossible to say from that line alone. However, suppose the line was written as:
std::search(i1, e1, i2, e2);
Now there isn't even a shadow of a doubt what this function is: A function in the C++ standard library. Even if you don't know what it does, it's easy to find info about it. I can think of only one valid usage for "using namespace", and that's if you want to import some custom namespace into another, like:
namespace SomeNamespace { using namespace AnotherNamespace; }
Any other use is horrible programming.
This is subjective. You might like it this way, but others might not. So I would not call it bad practice in any way.
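To show the kind of middle ground I have in mind, here is a quick sketch of my own (the function and names are made up, not anything from Warp's post):

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

void printSorted(std::vector<int> v)
{
    // Function-scope using-declarations: only the names actually used,
    // and only inside this one function, so nothing leaks anywhere else.
    using std::sort;
    using std::cout;

    sort(v.begin(), v.end());
    for (std::size_t i = 0; i < v.size(); ++i)
        cout << v[i] << '\n';
}

That way you avoid a global "using namespace std;", but you don't have to sprinkle std:: over every single line either.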
EEssentia wrote:
I would also point out a few things in the code... Use typedefs for pointers! The syntax looks real messy if you do not.
That should not be taken as a general principle. Sometimes substituting pointers with typedeffed names can only make the program harder to read because it will be harder to visually distinguish between pointers and non-pointer types.
Yes, you are right in a certain sense. typedef void* VoidPtr; is an example of bad usage of typedefs. But when it comes to function pointers that you return, it is usually a good idea, since you can see how verbose the syntax becomes without one.
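To show what I mean about returning function pointers, here is a toy example of my own (the names are made up):

#include <iostream>

int add(int a, int b) { return a + b; }
int sub(int a, int b) { return a - b; }

// Without a typedef, a function *returning* a function pointer looks like:
//     int (*pickOperation(int mode))(int, int);
// With a typedef, it reads like any other function:
typedef int (*BinaryOp)(int, int);   // pointer to a function "int(int, int)"

BinaryOp pickOperation(int mode)
{
    return mode == 0 ? add : sub;
}

int main()
{
    BinaryOp op = pickOperation(0);
    std::cout << op(2, 3) << std::endl;   // prints 5
}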
Do not use function pointers - use functors!
Functors can only be passed as parameters to template functions, and consequently cannot be used in every possible situation where a function pointer is required. (Also functors are more verbose to write, at least until the next C++ standard gets ratified and implemented widely.)
Partly true, I suppose. Functors usually do require templates, and often it is possible to use templates; in this example, it certainly was. Then there is std::function, or whatever it is called. I believe it can encapsulate a functor so long as you specify the signature the functor must have. Which is pretty much the same as the member function pointer, but probably faster.
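If I remember correctly, it looks roughly like this (depending on your compiler it may live in std::tr1 or Boost instead of std; the example itself is just something I made up):

#include <functional>
#include <iostream>

struct Multiply
{
    int operator()(int a, int b) const { return a * b; }
};

int add(int a, int b) { return a + b; }

int main()
{
    // std::function<int (int, int)> can hold anything callable with that
    // signature: a functor object, a plain function pointer, and so on.
    std::function<int (int, int)> op = Multiply();
    std::cout << op(2, 3) << std::endl;   // prints 6

    op = add;                             // now it holds a function pointer
    std::cout << op(2, 3) << std::endl;   // prints 5
}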
Furthermore, stripping the names of parameters in declarations is bad. Bad, bad, bad.
It's not always bad. For example, if you have, e.g., a member function named "setXCoordinate(int)" or "setModifiedFlag(bool)", those don't really require the parameters to be named. They are rather self-evident even without.
I cannot say that is a good argument. It obfuscates the code. Do not dwell on whether it is good or bad - just name them. Always.
(Also many people leave parameters of private function declarations unnamed to reduce the visual clutter of header files.)
This may or may not be a good idea. It all depends on what they are used for. But you are right that it might be okay.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
arflech wrote:
It is not a good idea to put a using declaration in the global scope, like right after the include directives, because it is then too easy for namespaces to clash.
I would not call it a bad idea, nor a good one. You just have to be aware that it can clash.
I would also point out a few things in the code... Use typedefs for pointers! The syntax looks really messy if you do not. Do not use function pointers - use functors! Functors are actually faster than function pointers, which would make the C++ example faster. Furthermore, stripping the names of parameters in declarations is bad. Bad, bad, bad. The C version is not much better, because its parameters lack descriptive names.
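Here is roughly why functors tend to be faster, as far as I understand it (a sketch of my own, not arflech's code): with a functor, the comparator's type is part of the template instantiation, so the compiler can inline the calls; with a function pointer, std::sort usually ends up making an indirect call for every comparison.

#include <algorithm>
#include <vector>

bool lessThan(int a, int b) { return a < b; }   // plain function

struct LessThan                                 // functor
{
    bool operator()(int a, int b) const { return a < b; }
};

void sortBoth(std::vector<int>& v, std::vector<int>& w)
{
    // Comparator passed as a function pointer: generally an indirect call
    // inside std::sort that the compiler cannot easily inline.
    std::sort(v.begin(), v.end(), &lessThan);

    // Comparator passed as a functor: the type carries the code, so the
    // compiler can inline operator() into the instantiated sort.
    std::sort(w.begin(), w.end(), LessThan());
}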
Experienced Forum User
Joined: 4/13/2009
Posts: 431
Bisqwit did something that is not supported by the C standard. It is OS-specific stuff. Being able to read and copy the actual machine code of a function in memory is something the OS deals with. The C standard does not care in the slightest, so as far as portable C is concerned, functions cannot be copied. Period.
Experienced Forum User
Joined: 4/13/2009
Posts: 431
It is true that references are less powerful than pointers, but they were never meant to replace pointers, only to exist alongside them to simplify certain things.
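A tiny sketch of what I mean (the example is mine): a pointer can be null and can be re-seated, a reference cannot, but for the common case the reference syntax is much cleaner.

#include <iostream>

void incrementRef(int& n) { ++n; }           // must be given a real int
void incrementPtr(int* n) { if (n) ++*n; }   // may be null, so it must be checked

int main()
{
    int a = 1, b = 10;

    incrementRef(a);    // plain call syntax, cannot pass null
    incrementPtr(&b);   // explicit address-of; a null pointer is possible

    int* p = &a;
    p = &b;             // a pointer can be pointed somewhere else later

    int& r = a;
    r = b;              // a reference cannot be rebound; this assigns b's
                        // value to a, it does not make r refer to b

    std::cout << a << ' ' << b << std::endl;   // prints 11 11
}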