#1 | Jay G (Disabled) | 02-01-14 02:34am
It's "know it when I see it" quality. Something that is technically hi-def but isn't crisp and clear is not real hi-def to me. Bad photography is bad photography no matter what equipment is used to film it.
#2 | Drooler (Disabled) | 02-01-14 03:26am
Dimensions usually do it for me, but as JayG pointed out, only seeing is believing. 1.5 GB of "hi-def" with dumb camera work, dim lighting, and dodgy editing will diss the damn thing pretty quick.
#3 | jberryl69 (Disabled) | 02-01-14 07:06am
^ & ^^
#4 | pat362 (0) | 02-01-14 09:43am
Screen resolution is what I use to determine if it's HD or not.
#5 | Monahan (0) | 02-01-14 11:19am
REPLY TO #4 - pat362: Ditto pat362.
#6 | Parsnip (0) | 02-02-14 08:45am
If it looks good, it's HD enough.
#7 | BobA (0) | 02-04-14 09:56am
Usually it's a combination of dimensions, bitrate, and the original source capture (camera) being in high definition. With those three you will usually get a "hi-def" picture.
You can pump up the dimensions and bitrate of a crappy source (like an old VHS tape), but despite the big file, it won't be real hi-def because the source isn't hi-def.
Or you can have a hi-def capture and high dimensions, but if the bitrate is low, you'll get a blocky/blurry/aliased picture - it won't look good.
All three are important for a good hi-def picture... but with higher bitrates and resolution you typically get an increasingly large file. Many productions will say 1080p... but with a lower bitrate, it won't look as good as it could.
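To put rough numbers on BobA's point, here is a minimal Python sketch of how average bitrate and running time translate into file size. The bitrates and the 45-minute running time are hypothetical examples for illustration, not figures from any actual release.

```python
# Rough size estimate: total bits = bitrate * duration, then convert to gigabytes.
def approx_size_gb(bitrate_mbps: float, duration_min: float) -> float:
    total_bits = bitrate_mbps * 1_000_000 * duration_min * 60
    return total_bits / 8 / 1_000_000_000  # bits -> bytes -> GB

if __name__ == "__main__":
    # Two hypothetical 1080p encodes of the same 45-minute video: the frame
    # dimensions match, but the starved bitrate will look blocky/blurry.
    for label, mbps in [("starved 1080p", 2.5), ("healthy 1080p", 8.0)]:
        print(f"{label}: {mbps} Mbps for 45 min -> about {approx_size_gb(mbps, 45):.1f} GB")
```

Run in reverse, the same arithmetic explains the instinct in #2: a file that is suspiciously small for its length and claimed resolution usually means a low bitrate.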
#8 | Tree Rodent (0) | 02-04-14 06:39pm
I put "other" because for me it's a mix of dimensions and bit rate. I suppose we all know what it looks like; it's a case of how we define it. I agree with BobA about the original source as well.