>For modern programs, you should #include <stdint.h> and then use standard types.
>
>For more details, see the stdint.h specification.
>
>The common standard types are:
>
>int8_t, int16_t, int32_t, int64_t — signed integers
>uint8_t, uint16_t, uint32_t, uint64_t — unsigned integers
>float — standard 32-bit floating point
>double — standard 64-bit floating point
>Notice we don't have char anymore. char is actually misnamed and misused in C.
>
>Developers routinely abuse char to mean "byte" even when they are doing unsigned byte manipulations. It's much cleaner to use uint8_t to mean a single unsigned byte/octet value and uint8_t * to mean a sequence of unsigned byte/octet values.
https://matt.sh/howto-c
Are you fucking kidding me?! So I should make my code as obtuse and hard to read as possible just because of the NON-issue of different platforms using different sizes for int, float, etc.?
Unless you're working with big data sets, I fail to see the reason for any of this.
Various other bloggers / HN users already refuted this article.
I still like him though, he seems like a cool homosexual.
>>57633946
You don't happen to have links to the articles by some of these gender fluid earthkin?
>>57633763
however it's true for 32 vs 64 bit
>>57633763
>int8_t, int16_t, int32_t, int64_t — signed integers
>uint8_t, uint16_t, uint32_t, uint64_t — unsigned integers
Aren't these unportable? If the processor can't address those exact sizes for whatever reason, it won't compile.
char is better because it is always exactly one byte, no matter the processor (even if a byte is 10 bits, for example).
int_leastN_t or int_fastN_t are better, since they always store at least N bits.
At least, that's how I understand it.
>>57634192
>char is better because it is always exactly one byte, no matter the processor (even if a byte is 10 bits, for example)
There are basically no systems anymore where a byte is not 8 bits, and those where that is the case do not support "char" at all
>>57633763
#define int int32_t
#define char uint8_t
>>57633763
>https://matt.sh/howto-c
>The first rule of C is don't write C if you can avoid it.
Stopped reading at that line
C U C K E D
U C K E D C
C K E D C U
K E D C U C
E D C U C K
D C U C K E
>>57634151
I thought he linked to them in the article?
If not, jfgi; it's not hard to find.
>>57634307
Seriously, this is common for <10000 line programs?