Gokul
(one of many) why are my bitwise operations not working?
private static char[,] board = new char[3, 3];
This is how you declare the board array.
And the method to check whether there is a winner can be done like this:
static bool CheckWinner()
{
    // Check rows and columns
    for (int i = 0; i < 3; i++)
    {
        if (board[i, 0] == currentPlayer && board[i, 1] == currentPlayer && board[i, 2] == currentPlayer) return true;
        if (board[0, i] == currentPlayer && board[1, i] == currentPlayer && board[2, i] == currentPlayer) return true;
    }
    // Check both diagonals
    if (board[0, 0] == currentPlayer && board[1, 1] == currentPlayer && board[2, 2] == currentPlayer) return true;
    if (board[0, 2] == currentPlayer && board[1, 1] == currentPlayer && board[2, 0] == currentPlayer) return true;
    return false;
}
Hope you can fill in the rest of the logic from this.
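For context (not from the original reply), here is a minimal sketch of how board, currentPlayer and CheckWinner could fit together; PlaceMoveAndCheck is a hypothetical helper name:

private static char[,] board = new char[3, 3];
private static char currentPlayer = 'X';

// Hypothetical helper: place the current player's mark, report whether they just won
static bool PlaceMoveAndCheck(int row, int col)
{
    board[row, col] = currentPlayer;
    if (CheckWinner()) return true;
    currentPlayer = currentPlayer == 'X' ? 'O' : 'X'; // otherwise switch turns
    return false;
}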
71 replies
(one of many) why are my bitwise operations not working?
For this you won't slice it. For a tic-tac-toe project you would want a 2D array to make the logic much simpler, plus a separate function that checks all possible winning scenarios.
71 replies
Cache response time issue
Check this article on how they restructured their data to save memory; it applies to all caching software: https://instagram-engineering.com/storing-hundreds-of-millions-of-simple-key-value-pairs-in-redis-1091ae80f74c
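The core idea in that article is to bucket many small key-value pairs into a smaller number of Redis hashes instead of keeping one key per record. A rough sketch of that pattern with StackExchange.Redis (the bucket size of 1000 and the "user:" key prefix are illustrative assumptions):

using StackExchange.Redis;

var redis = ConnectionMultiplexer.Connect("localhost");
var db = redis.GetDatabase();

// Instead of one key per record (SET user:1234567 <value>), bucket records into
// hashes: key = user:1234 (id / 1000), field = 567 (id % 1000)
void SetBucketed(long id, string value) =>
    db.HashSet($"user:{id / 1000}", id % 1000, value);

string? GetBucketed(long id) =>
    db.HashGet($"user:{id / 1000}", id % 1000);

Fewer top-level keys means far less per-key overhead, which is where the memory saving comes from.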
31 replies
Seeding Data Into Database
The issue is that in the User class you have marked PasswordHash and PasswordSalt as required, which means that during deserialization a value is expected to be present in the JSON. It doesn't matter that you set the values later on inside a for loop; the error states that during deserialization it is unable to find PasswordHash and PasswordSalt values.
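A minimal sketch of what that looks like, assuming System.Text.Json and a User class roughly like this (the byte[] types and the SeedUser name are assumptions):

public class User
{
    public required string Username { get; set; }
    // 'required' tells System.Text.Json these members must appear in the JSON,
    // so Deserialize throws when the seed file omits them
    public required byte[] PasswordHash { get; set; }
    public required byte[] PasswordSalt { get; set; }
}

// One fix: deserialize into a type without 'required' on those members (or drop
// 'required' / give defaults on User itself) and compute the hash + salt in the loop
public class SeedUser
{
    public required string Username { get; set; }
    public byte[] PasswordHash { get; set; } = Array.Empty<byte>();
    public byte[] PasswordSalt { get; set; } = Array.Empty<byte>();
}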
11 replies
Cache response time issue
I have worked with Redis cache, and the general rule of thumb is that if a cache key holds more than 2 MB of data then something is wrong with the current approach. I know newer caching software can handle large sets of data, but remember that in your example you are reading a table and creating a key, and that table can and will get bigger. Storing the table as a whole in a single key is not optimal; similarly, storing each row in its own key is also not optimal, because once the number of keys gets large, pattern-based searching takes time since it has to scan a huge set of keys. The best way to approach this problem is to identify your data needs: can you split the data into reasonable subsets, for example by grouping rows on a type (column)?
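A minimal sketch of that grouping idea with StackExchange.Redis (the Order type, the Region column used for grouping, and the "orders:region:" key prefix are all assumptions for illustration):

using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using StackExchange.Redis;

public record Order(int Id, string Region, decimal Total);

public class OrderCache
{
    private readonly IDatabase _db;
    public OrderCache(IConnectionMultiplexer redis) => _db = redis.GetDatabase();

    // One hash per region instead of one giant key for the whole table
    // (or one key per row): field = order id, value = serialized row
    public void CacheOrders(IEnumerable<Order> orders)
    {
        foreach (var group in orders.GroupBy(o => o.Region))
        {
            var entries = group
                .Select(o => new HashEntry(o.Id, JsonSerializer.Serialize(o)))
                .ToArray();
            _db.HashSet($"orders:region:{group.Key}", entries);
        }
    }

    // Fetch just the subset you need with a single key lookup, no key scanning
    public List<Order?> GetOrdersForRegion(string region) =>
        _db.HashGetAll($"orders:region:{region}")
           .Select(e => JsonSerializer.Deserialize<Order>(e.Value.ToString()))
           .ToList();
}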
31 replies
Cache response time issue
I have faced a similar kind of issue where we were saving whole-table data which we would later fetch and process before returning it as the response. The best thing to do is not to store lists of huge objects in any caching layer; instead of saving entire tables in the cache, understand the data and how it could be restructured in a way that reduces the key size. I would love to help you come up with a solution; if it's a personal project I can take a look at it on GitHub.
31 replies
Trouble enabling CORS
I am not familiar with React, but I can see one thing missing in your CORS setup. Update the line below to include .AllowAnyHeader():
options.AddPolicy(name: MyAllowSpecificOrigin,
    policy =>
    {
        policy.WithOrigins("http://localhost:5173").AllowAnyMethod().AllowAnyHeader();
    });
The reason for doing this is that sometimes your request will be passing a few extra headers which might be required, for example the Authorization header.
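In case it is not already there, the policy also has to be applied to the request pipeline with UseCors; a sketch assuming the minimal-hosting Program.cs style:

var builder = WebApplication.CreateBuilder(args);
const string MyAllowSpecificOrigin = "_myAllowSpecificOrigin"; // assumed policy name

builder.Services.AddControllers();
builder.Services.AddCors(options =>
{
    options.AddPolicy(name: MyAllowSpecificOrigin,
        policy =>
        {
            policy.WithOrigins("http://localhost:5173").AllowAnyMethod().AllowAnyHeader();
        });
});

var app = builder.Build();

// UseCors must come before the endpoints the browser is calling
app.UseCors(MyAllowSpecificOrigin);
app.MapControllers();

app.Run();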
Hope it helps
3 replies
(one of many) why are my bitwise operations not working?
Basically what it says is that your int array can hold a null reference because it is never initialised, which could throw an exception if you try to do any operation on it. One way to fix it is to give the array a default value, for example in a constructor, or you could declare it as nullable, like public int[]? Board.
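A quick sketch of both options (the Game class and Board property names are just placeholders):

public class Game
{
    // Option 1: give the array a default value so it is never null
    public int[] Board { get; set; } = new int[9];

    // ...or initialise it in a constructor instead:
    // public Game() { Board = new int[9]; }
}

public class GameWithNullableBoard
{
    // Option 2: declare it as nullable and null-check before using it
    public int[]? Board { get; set; }
}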
71 replies
Help me understand how you would approach this problem.
There is no problem with showing the graph data on the UI screen. The problem is when we start migrating almost 90% of the Master DB content into a different DB under a specific Client ID: the data migration must happen within a transaction, and the volume of data being migrated is huge, which is causing an out-of-memory exception and makes the API way too slow.
15 replies