About access violation on WriteProcessMemory()

So, I made a simple memory scanner that you can see right here: https://github.com/nyeecola/lovscan/

It actually works fine and I'm very happy with it, but there's one thing I can't quite understand yet.
Maybe some of you guys that are more familiar with the Win32 API can help me a little bit?

Here's the issue:
- I call WriteProcessMemory on a region retrieved by VirtualQueryEx and it works just fine, but sometimes when I close my program it hangs with the "stopped responding" message.
- It only happens when I close the program.
- It really is related to the WriteProcessMemory call, because it only happens with some memory locations and not with others (this is reproducible).

My guess:
- Maybe it hangs because it is writing to a memory location that isn't supposed to be written to (see the sketch after this list).
- If that is true, why does it only hang after the program is done running?
- Also, why does it still write to it anyway? (I can see this by calling ReadProcessMemory on the same address)
- And one last thing: Why would it hang my program and not the one I'm writing to? I mean, it would make sense to hang the program whose memory is being written, right? Why is this not the case here?
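
To make what I mean concrete, here is a simplified sketch of the calls involved (this is not the exact code from the repo; hProcess, address, patch and patchSize are made-up names):

MEMORY_BASIC_INFORMATION mbi;
if (VirtualQueryEx(hProcess, address, &mbi, sizeof(mbi)) != 0) {
    // only write if the page is committed and has a writable protection
    DWORD writable = PAGE_READWRITE | PAGE_WRITECOPY | PAGE_EXECUTE_READWRITE | PAGE_EXECUTE_WRITECOPY;
    if (mbi.State == MEM_COMMIT && (mbi.Protect & writable)) {
        SIZE_T written = 0;
        BOOL ok = WriteProcessMemory(hProcess, address, patch, patchSize, &written);
        // ok and written show whether the write actually went through
    }
}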

Edited by Italo on
If your program crashes somewhere outside of the WriteProcessMemory call, then this is a bug in your code. If it crashes somewhere outside the "main" function, then maybe you are corrupting the stack or heap of the C runtime.

Check the callstack in the debugger when your program crashes.
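
It also helps to check right at the call site whether the write itself fails - something like this (just a sketch, hProcess/address/buffer/size are placeholders for whatever your code uses):

SIZE_T written = 0;
BOOL ok = WriteProcessMemory(hProcess, address, buffer, size, &written);
if (!ok) {
    printf("WriteProcessMemory failed at %p, error %lu\n", address, GetLastError());
}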

Looking at your code, I can say that the ConvertAsciiToUtf8 and ConvertUtf8ToAscii functions do something strange.
First of all - imgui expects all strings to be UTF-8. But judging by their names, you are trying to convert UTF-8 to ASCII when passing strings to imgui, and ASCII to UTF-8 when getting strings from imgui. Even then, your conversion functions do something completely different: ConvertAsciiToUtf8 converts a UTF-8 string to UTF-16, and ConvertUtf8ToAscii converts a UTF-16 string to the system-specific 8-bit locale (not UTF-8).

Not sure if these functions are "the bug", but they are definitely either wrong or being called wrong.

Your ConvertAsciiToUtf8 function needs to be called ConvertUtf8toUtf16 and it should look like this:
// assumes the Utf8 input is zero terminated
// maxbytes is how many bytes are allowed to be written into the Utf16 buffer
int ConvertUtf8toUtf16 (char *Utf8, char *Utf16, int maxbytes) {
    int MaxUtf16Count = maxbytes / sizeof(WCHAR); // max number of UTF-16 code units that fit
    // pass -1 for the length of Utf8, because it is 0 terminated, no need for strlen
    int LenInWchar = MultiByteToWideChar(CP_UTF8, 0, Utf8, -1, (LPWSTR)Utf16, MaxUtf16Count);
    return LenInWchar * sizeof(WCHAR); // return bytes written into the Utf16 buffer
}
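
For example, to pass a UTF-8 string that came from an imgui input box to a W function (FileNameUtf8 is just a made-up name for the imgui buffer):

char FileNameUtf16[2 * MAX_PATH];
ConvertUtf8toUtf16(FileNameUtf8, FileNameUtf16, sizeof(FileNameUtf16));
HANDLE File = CreateFileW((WCHAR *)FileNameUtf16, GENERIC_READ, FILE_SHARE_READ, NULL, OPEN_EXISTING, 0, NULL);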


And your ConvertUtf8ToAscii function needs to be called ConvertUtf16toUtf8 and it should look like this:
// the Utf16 input does not need to be 0 terminated, pass its length in bytes
// maxbytes is how many bytes are allowed to be written into the Utf8 buffer, it will always 0 terminate
int ConvertUtf16toUtf8 (char *Utf16, int LenInBytes, char *Utf8, int maxbytes) {
    int Utf16Count = LenInBytes / sizeof(WCHAR); // length of the Utf16 string in UTF-16 code units (not count of chars)
    int ByteCount = WideCharToMultiByte(CP_UTF8, 0, (LPWSTR)Utf16, Utf16Count, Utf8, maxbytes - 1, NULL, NULL);
    Utf8[ByteCount] = 0; // always 0 terminate
    return ByteCount; // return length of the written Utf8 string in bytes (this is not a count of chars)
}
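
And for example, to show a window title obtained from a W function in imgui (Window here is just a placeholder HWND):

WCHAR TitleUtf16[256];
int TitleChars = GetWindowTextW(Window, TitleUtf16, 256);
char TitleUtf8[3 * 256 + 1]; // UTF-8 can need up to 3 bytes per UTF-16 code unit, +1 for terminator
ConvertUtf16toUtf8((char *)TitleUtf16, TitleChars * sizeof(WCHAR), TitleUtf8, sizeof(TitleUtf8));
// TitleUtf8 is now safe to pass to imgui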


Edited by Mārtiņš Možeiko on