1.15.2 Limit the amount of a specific block per chunk.

Discussion in 'Spigot Plugin Development' started by Xuho, Jan 25, 2020.

  1. The easiest way of doing this is iterating through all the blocks of the chunk like:
    Code (Text):
      public int count(Chunk chunk, Material mat){
        int count = 0;
        for (int x = 0; x < 16; x++) {
          for (int z = 0; z < 16; z++) {
            for (int y = 0; y < 256; y++) {
              if (chunk.getBlock(x, y, z).getType() == mat) {
                count++;
              }
            }
          }
        }
        return count;
      }
    But I'm wondering if there is any faster way of doing this.
     
  2. You can log the amounts in each chunk into an SQL database, and when something adds a new block you check and update the database.
     
  3. You'd probably also want an in-memory cache otherwise you're going to cause a lot of database traffic.
     
  4. That goes without saying.
     
  5. I want to avoid using databases if possible. Any ideas?
     
  6. Just store it all in memory then. Unless you plan to have a lot of players, it'll serve you just fine.
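    A minimal sketch of the in-memory approach, assuming you track one material and key the cache by packed chunk coordinates. The class and method names are made up for illustration; in a real plugin you would call `onPlace`/`onBreak` from `BlockPlaceEvent` and `BlockBreakEvent` handlers:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical in-memory counter for one tracked material per chunk.
// Hook this up to BlockPlaceEvent / BlockBreakEvent in a real plugin.
class ChunkBlockCounter {
    private final Map<Long, Integer> counts = new HashMap<>();

    // Pack chunk X/Z into a single long key.
    private static long key(int chunkX, int chunkZ) {
        return ((long) chunkX << 32) | (chunkZ & 0xFFFFFFFFL);
    }

    public void onPlace(int chunkX, int chunkZ) {
        counts.merge(key(chunkX, chunkZ), 1, Integer::sum);
    }

    public void onBreak(int chunkX, int chunkZ) {
        counts.merge(key(chunkX, chunkZ), -1, Integer::sum);
    }

    public int get(int chunkX, int chunkZ) {
        return counts.getOrDefault(key(chunkX, chunkZ), 0);
    }
}
```

    Once the cache is populated (e.g. by a one-time scan like the code in the first post), checking the limit is a constant-time map lookup instead of a 65,536-block loop.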
     
  7. When populating the cache for the first time for a chunk, you might want to consider taking a chunk snapshot and looping through all blocks asynchronously if it takes less than doing the computation on the same thread.
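    The snapshot idea as a sketch: take the snapshot on the main thread, then do the counting off-thread. Here a plain `int[][][]` stands in for Bukkit's `ChunkSnapshot` (which is obtained via `Chunk#getChunkSnapshot()` and is safe to read asynchronously); the counting pattern is the point, not the data type:

```java
import java.util.concurrent.CompletableFuture;

class AsyncChunkCount {
    // Count occurrences of `target` without blocking the calling thread.
    // With the real API you would take chunk.getChunkSnapshot() on the main
    // thread and read the snapshot's block types inside the async task.
    public static CompletableFuture<Integer> countAsync(int[][][] snapshot, int target) {
        return CompletableFuture.supplyAsync(() -> {
            int count = 0;
            for (int[][] plane : snapshot)
                for (int[] row : plane)
                    for (int id : row)
                        if (id == target) count++;
            return count;
        });
    }
}
```

    Whether this wins depends on chunk size and thread overhead, so profile it against the synchronous loop as suggested above.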
     
  8. FrostedSnowman

    Resource Staff

  9. A database is good because you can persist the counts across restarts; otherwise you have to rebuild them fresh. It's also good because you can save unused chunk data to the database to free up cache space/memory.
     
  10. There's no need to persist anything in this case. You can just count the blocks in a chunk. The method I directed the OP to will give you results in <=1 ms.
     
  11. There's a need to persist if you want better performance and scalability. This could greatly affect a server when large TNT explosions or WorldEdit operations happen. If you were to break every block in a chunk you'd have to iterate 65,536 times (16 x 16 x 256) versus just once with a cache.
     
  12. ?
     
  13. But once you have a large number of chunks in your database, depending on the position of that chunk in the database, wouldn't it be better to just iterate through all the blocks?
     
  14. Not if you search the database with a binary search; you can make it quite efficient.
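    For illustration, this is what a binary search over sorted chunk keys looks like in memory; with an actual database you would get the same effect from an index on the chunk-coordinate column rather than hand-rolling the search:

```java
import java.util.Arrays;

class ChunkIndex {
    // sortedKeys holds packed chunk coordinates in ascending order, with
    // counts[i] the cached block count for sortedKeys[i]. Arrays.binarySearch
    // gives an O(log n) lookup instead of a linear scan over every chunk.
    public static int lookup(long[] sortedKeys, int[] counts, long key) {
        int i = Arrays.binarySearch(sortedKeys, key);
        return i >= 0 ? counts[i] : 0;
    }
}
```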
     
  15. The methods are entirely different.
     
  16. Code (Text):
        public static int[] getMaterialAmount(final Chunk chunk) {
            final int[] amount = new int[Material.values().length];
            final World world = chunk.getWorld();
            // Convert chunk coordinates to world coordinates.
            final int minX = chunk.getX() << 4;
            final int minZ = chunk.getZ() << 4;
            final int maxX = minX | 15;
            final int maxY = world.getMaxHeight(); // exclusive upper bound
            final int maxZ = minZ | 15;

            for (int x = minX; x <= maxX; ++x) {
                for (int y = 0; y < maxY; ++y) {
                    for (int z = minZ; z <= maxZ; ++z) {
                        // These are world coordinates, so use World#getBlockAt;
                        // Chunk#getBlock expects chunk-relative 0-15 coordinates.
                        ++amount[world.getBlockAt(x, y, z).getType().ordinal()];
                    }
                }
            }
            return amount;
        }
    If you refer to this code, it's mostly the same, but it returns the amounts of every block type in the chunk.
    If you refer to the other examples, they won't work in my case because they take advantage of the fact that spawners have an attached TileEntity, so they only have to iterate through the tile entities in that chunk.
     
  17. Isn't that what you want? To count the blocks so you can determine whether you want to limit them? That's a fast way to do it.
     
  18. As I just want to get the amount of a specific block type in a chunk, I don't see why that code would be better than the example I gave at the start of the post.
     
  19. There will be a break-even point. If the DB query takes longer than iterating over the blocks (which is a very real possibility), then it's not worth implementing. You'd have to profile the time a DB query takes versus just iterating. The important thing is to have an in-memory cache so that hot chunks aren't repeatedly iterated. Something like a weak hashmap may be an effective way to evict entries.
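    A sketch of the weak-hashmap eviction idea: a `WeakHashMap` keyed by the chunk object drops its entry once nothing else holds the chunk strongly (e.g. after the server unloads it), so the cache can't grow without bound. Keying by `Chunk` and relying on unload timing are assumptions here, and exactly when an entry disappears is up to the garbage collector:

```java
import java.util.Map;
import java.util.WeakHashMap;

class EvictingChunkCache<K> {
    // Entries vanish automatically once the key (e.g. a Chunk object)
    // is no longer strongly referenced anywhere else on the server.
    private final Map<K, Integer> counts = new WeakHashMap<>();

    public void put(K chunk, int count) { counts.put(chunk, count); }

    public Integer get(K chunk) { return counts.get(chunk); }
}
```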
     
  20. Benchmark, see which is faster, use that.